Create ML


Create machine learning models for use in your app using Create ML.

Create ML Documentation

Posts under Create ML tag

58 Posts
Post not yet marked as solved
1 Replies
94 Views
Hi from France! I'm trying to create a model for dice detection. I've taken about 100 photos of dice all showing the same side (1 pip). Are my bounding boxes good? Should they cover the whole die? I launched the training and it seems to work well. Then in the Evaluation tab the values seem not great but not bad: I/U 84%, Varied I/U 44%. The validation score is very low. In the Preview tab, no matter what image I give it, I get no detection. What am I missing? What should I improve?
Posted by Cyril42. Last updated.
Post not yet marked as solved
0 Replies
81 Views
Hey there, I am new to developing in the Apple ecosystem, as well as a novice in the field of ML. I have dabbled a little in ML with face recognition in Python, but that's about it. What I've found is that there are quite a lot of training algorithms for computer vision, but for my use case I don't really know what I am looking for from an ML perspective.

The thing is, I have real-time telemetry data where I have to determine the current (or near-current) state based on historic data (from now backwards for 0 to something like 20 seconds, or more). The initial state is pretty easy to determine from a few values; basically, a UInt16 is set to 65535. Most of the possible states can be determined over time from the telemetry, but certain cases can vary quite a lot. These cases would very likely be easy for a human examining the logs to pick out, but they seem hard or impossible to capture in programmatic logic. The sample rate in real time is pretty steady at 60 Hz.

Basically, I'm curious about what kind of ML model would be suitable to train on this kind of data, and how. Generally, creating the training data shouldn't be a huge problem. Even if it takes a while to manually mark up the various state changes, it is far from infeasible. Much of the data will differ wildly from dataset to dataset, while some will be very similar from one dataset to another. So I would need a model that can be trained on several datasets of telemetry (including when the state changes) and can then, given real-time data for the last 0-10 seconds or so, determine what state we're in at the moment. In most cases it can even lag by a second or so; the state change itself is not time-critical as such, but the model should determine the state with as high a confidence as possible within at most 3 seconds.

As I understand it, the Create ML app can be used to train a model for use in the app in question. But as mentioned, I have no real idea what is most suitable for training on this kind of data, which is supposed to be analyzed over time. Tabular regression? Recommendation? Something else entirely? I'm guessing that in production code the real-time data would be supplied to the model as a dataset with the last 20-30 seconds of data, rather than just the last packet of telemetry, but this is just my assumption. I am attaching a text file with comma-separated values from sample telemetry, somewhat truncated for brevity. There are a variety of fields, including coordinates in XYZ space, which have some relation to the state but can vary wildly from one dataset to another (depending on location). I assume the training would automatically give those fields less weight? If anyone can point me in the right direction, I would be really grateful. The finished model will eventually be used in an iOS app that displays the real-time telemetry. Thanks in advance, /Peter sample-telemetry.txt
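For windowed telemetry like this, each training example is usually a fixed-length slice of the stream rather than a single packet. A minimal pure-Python sketch of that windowing step, independent of any particular Create ML model type (the field layout, window length, and stride are all assumptions):

```python
# Sketch: turning a 60 Hz telemetry stream into fixed-length windows for a
# sequence classifier. Field names, window length, and stride are hypothetical.

def make_windows(samples, window_size=600, stride=60):
    """Slice a list of per-tick samples into overlapping windows.

    At 60 Hz, window_size=600 covers the last 10 seconds; stride=60
    produces one training example per second.
    """
    windows = []
    for start in range(0, len(samples) - window_size + 1, stride):
        windows.append(samples[start:start + window_size])
    return windows

# 30 seconds of fake telemetry: each tick is (value, x, y, z)
telemetry = [(i, 0.0, 0.0, 0.0) for i in range(1800)]
windows = make_windows(telemetry)
print(len(windows))     # 21 windows: starts at ticks 0, 60, ..., 1200
print(len(windows[0]))  # 600 ticks per window
```

At prediction time the same function applied to the live buffer yields the single most recent window, which matches the guess above that the model should see the last N seconds rather than one packet.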
Posted by Bornhall. Last updated.
Post not yet marked as solved
1 Replies
244 Views
Hi there, I trained an object detection model in Create ML and want to deploy it in Python. I can get coordinates from the following code, but how do I get the labels of the prediction?

import coremltools as ct
from PIL import Image

path = '/sample.jpeg'
example_image = Image.open(path).resize((960, 128))
model_path = '/sample.mlmodel'
model = ct.models.MLModel(model_path)
out_dict = model.predict({'imagePath': example_image})
out_dict
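One approach, sketched here with a hypothetical label list: a Create ML object detector's "confidence" output typically has one row per detected box and one column per class, so the label is the class-list entry at the row's argmax. The class order can often be recovered from the model's metadata (for example via model.get_spec().description.metadata.userDefined in coremltools); the names below are placeholders:

```python
# Sketch: mapping one row of a detector's per-class confidence array back
# to a label. The label list is assumed; in practice read it from the
# model's metadata so it matches the training class order.

labels = ["die_1", "die_2", "die_3"]   # hypothetical class order

def label_for_row(confidence_row, labels):
    """Return (label, confidence) for the highest-scoring class."""
    best = max(range(len(confidence_row)), key=lambda i: confidence_row[i])
    return labels[best], confidence_row[best]

row = [0.05, 0.90, 0.05]               # one detection's per-class scores
print(label_for_row(row, labels))      # ('die_2', 0.9)
```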
Posted by anguslou. Last updated.
Post not yet marked as solved
1 Replies
211 Views
I would like to detect specific objects based on a text input for a Core ML project. I am currently using iOS 14 along with Xcode and Apple's object detection sample linked here. For example, can I have a text box with live capture in the background, so that when you type "banana" into the text box it only detects bananas, and when you type "apples" it only detects apples? Would I have to filter the [VNRecognizedObjectObservation] array? (Basically, I just want to control which objects are detected.)
Posted by Archie_S. Last updated.
Post not yet marked as solved
1 Replies
367 Views
What tools are folks using to create the JSON file needed to train a custom word tagger model? I've tried Doccano, but it exports JSONL, which is very different from what Create ML is expecting (example of the required format here: https://developer.apple.com/documentation/naturallanguage/creating_a_word_tagger_model). Are there standard tools or utilities that export/convert to the Create ML format? Thanks.
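One option is a small conversion script. A hedged sketch of the JSONL-to-Create ML step, assuming Doccano's "text"/"label" span format (check this against your actual export) and the array-of-{"tokens", "labels"} record shape shown in the linked Apple article:

```python
# Sketch: converting Doccano-style JSONL span annotations into token/label
# records for a Create ML word tagger. Field names "text" and "label"
# (with [start, end, tag] spans) are assumptions about the export format.

import json

def spans_to_record(text, spans):
    """Whitespace-tokenize text and assign each token the tag of the
    span covering its start offset, or "NONE" otherwise."""
    tokens, labels = [], []
    offset = 0
    for token in text.split():
        start = text.index(token, offset)
        tag = "NONE"
        for s, e, name in spans:
            if s <= start < e:
                tag = name
                break
        tokens.append(token)
        labels.append(tag)
        offset = start + len(token)
    return {"tokens": tokens, "labels": labels}

line = '{"text": "I love Paris", "label": [[7, 12, "PLACE"]]}'
doc = json.loads(line)
record = spans_to_record(doc["text"], doc["label"])
print(json.dumps([record]))
# [{"tokens": ["I", "love", "Paris"], "labels": ["NONE", "NONE", "PLACE"]}]
```

Running this over every JSONL line and dumping the collected records as one JSON array yields a file in the shape the word tagger trainer expects.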
Posted by Means. Last updated.
Post not yet marked as solved
0 Replies
184 Views
I would like to know whether the WWDC 2019 sample code can still be accessed somehow. I was referring to the video "Building Activity Classification Models in Create ML". Working sample code is really useful for verifying this capability. Any suggestions would be helpful.
Posted. Last updated.
Post not yet marked as solved
0 Replies
339 Views
TensorFlow Metal is not working on an M1 MacBook Air (8 GB RAM).

# test.py
import tensorflow as tf

cifar = tf.keras.datasets.cifar100
(x_train, y_train), (x_test, y_test) = cifar.load_data()
model = tf.keras.applications.ResNet50(
    include_top=True,
    weights=None,
    input_shape=(32, 32, 3),
    classes=100,
)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer="adam", loss=loss_fn, metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=64)

Here's the error log:

> python test.py
Metal device set to: Apple M1
systemMemory: 8.00 GB
maxCacheSize: 2.67 GB
2022-10-28 23:09:44.144540: I tensorflow/core/common_runtime/pluggable_device/pluggable_device_factory.cc:305] Could not identify NUMA node of platform GPU ID 0, defaulting to 0. Your kernel may not have been built with NUMA support.
2022-10-28 23:09:44.144913: I tensorflow/core/common_runtime/pluggable_device/pluggable_device_factory.cc:271] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 0 MB memory) -> physical PluggableDevice (device: 0, name: METAL, pci bus id: <undefined>)
[1] 89863 bus error  python test.py

Another test on custom code raises this error:

1 leaked semaphore objects to clean up at shutdown
Posted by udayjain. Last updated.
Post not yet marked as solved
0 Replies
218 Views
Hello! In my iPhone app, I have created a button so that I can retrain my Create ML linear regressor:

let model = try MLLinearRegressor.init(trainingData: trainingData.base, targetColumn: "Target")

Now I want to use this model in the UI. I know that I can use the model above directly, like this:

let pred = try model.predictions(from: testData)

However, I prefer using .mlmodel files, because I want to receive one prediction at a time instead of having to input a DataFrame. I have seen examples where people use the model.write method to store the .mlmodel on the computer desktop, for example, but I want to do it at runtime in the app. Is this possible? And in that case, how can I store the .mlmodel file and load it again? Can I do it in the app repository?
Posted by miriamkw. Last updated.
Post not yet marked as solved
1 Replies
565 Views
Hi, I have archived a Core ML model and deployed it via Core ML Model Deployment. When I try to access it in the app, I get the error "Failed to begin access for model collection with identifier". I have double-checked the identifiers in my app and in Core ML Model Deployment; both are the same. Not sure what mistake I'm making. Thanks in advance. Please see my code from Xcode below:

func asddasd() {
    if #available(iOS 14.0, *) {
        let progress = MLModelCollection.beginAccessing(identifier: "MainCoreMLCollection", completionHandler: modelCollectionAvailable)
        debugPrint(progress)
    } else {
        // Fallback on earlier versions
    }
}

@available(iOS 14.0, *)
func modelCollectionAvailable(result: Result<MLModelCollection, Error>) {
    switch result {
    case .success(let collection):
        debugPrint("Model collection \(collection.identifier) is now available.")
        // Load a model from the collection.
        loadModel("BCGSearchClassifier", from: collection)
    case .failure(let error):
        debugPrint("Error accessing a model collection: \(error)")
    }
}

@available(iOS 14.0, *)
func loadModel(_ modelName: String, from collection: MLModelCollection) {
    debugPrint(collection.entries[modelName])
    guard let entry = collection.entries[modelName] else {
        debugPrint("Couldn't find model \(modelName) in \(collection.identifier).")
        return
    }
    MLModel.load(contentsOf: entry.modelURL) { result in
        switch result {
        case .success(let modelFromCollection):
            // Use the modelFromCollection instance.
            debugPrint("success")
            debugPrint(modelFromCollection)
        case .failure(let error):
            debugPrint("Error loading model \(modelName) in \(collection.identifier): \(error).")
        }
    }
}

ERROR:
MLModelCollection: namespace (TeamID.BCGCoreMLCollection) download failed.
Error Domain=com.apple.trial Code=0 "Unknown error." UserInfo={NSLocalizedDescription=Unknown error.}
Error Domain=com.apple.CoreML Code=10 "Failed to begin access for model collection with identifier 'MainCoreMLCollection': invalid identifier" UserInfo={NSLocalizedDescription=Failed to begin access for model collection with identifier 'MainCoreMLCollection': invalid identifier}
Posted by ragulml03. Last updated.
Post not yet marked as solved
20 Replies
3.1k Views
Hello. When I used Xcode to generate a model encryption key, an error was reported: "Failed to Generate Encryption Key. Sign in with your Apple ID in the Apple ID pane in System Preferences and retry." But I am already logged in with my Apple ID in System Preferences, and this error still occurs. I reinstalled Xcode and logged in to my Apple ID again; the error persists. Xcode Version 12.4, macOS Catalina 10.15.7. Thanks.
Posted by lake-tang. Last updated.
Post not yet marked as solved
3 Replies
777 Views
Being brand new to Create ML, I tried to run my own ML project. When creating my own image classifier (the same happens with tabular classification), I fail from the start: when selecting valid training data, Create ML says "Data Analysis stopped". I'm using Create ML Version 3.0 (78.7). Any suggestions?
Posted by MarcoGMuc. Last updated.
Post not yet marked as solved
1 Replies
684 Views
Following the guide found here, I've been able to preview image classification in Create ML and Xcode. However, when I swap out the MobileNet model for my own and try running it as an app, images are not classified accurately. When I check the same images using my model in its Xcode preview tab, the guesses are accurate. I've tried changing this line to the different available options, but it doesn't seem to help: imageClassificationRequest.imageCropAndScaleOption = .centerCrop Does anyone know why a model would work well in preview but not while running in the app? Thanks in advance.
Posted. Last updated.
Post not yet marked as solved
1 Replies
277 Views
I was working with a CSV file whose header is located in the 4th row; how do I skip the first 3 rows in the TabularData framework? Note that a skip-rows option is available in the MLDataTable framework as well as in Python's pandas.

import Foundation
import TabularData

let options = CSVReadingOptions(
    hasHeaderRow: false,
    nilEncodings: ["", "nil"],
    ignoresEmptyLines: true
)
let dataPath = "https://www2.census.gov/programs-surveys/saipe/datasets/time-series/model-tables/irs.csv"
var dataFrame = try! DataFrame(contentsOfCSVFile: URL(string: dataPath)!, rows: 0..<15, options: options)
print(dataFrame.description)
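One workaround is to drop the leading lines from the raw text before parsing. The idea, sketched here in Python with the csv module (the junk lines and column names below are made up); the same line-slicing works in Swift before handing the trimmed text to a DataFrame CSV initializer:

```python
# Sketch: dropping the first three lines of a CSV so the fourth line
# becomes the header. The sample data is invented for illustration.

import csv
import io

raw = """junk line 1
junk line 2
junk line 3
year,state,returns
1989,AL,1580477
1989,AK,211276
"""

SKIP = 3
trimmed = "\n".join(raw.splitlines()[SKIP:])
rows = list(csv.DictReader(io.StringIO(trimmed)))
print(rows[0]["state"])   # AL
print(len(rows))          # 2
```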
Posted by bhregmi. Last updated.
Post not yet marked as solved
1 Replies
339 Views
I used Create ML to generate a text classification .mlmodel file. I want to import it into my app and let users use it. How do I import it, and what code do I need to write? Please note: 1. My app is written with SwiftUI. 2. This Create ML .mlmodel file is used for text classification.
Posted by lijiaxu. Last updated.
Post not yet marked as solved
1 Replies
631 Views
I'm attempting to build a Create ML image classification model with 60k images and 4k classes. Each time I run it, it manages to extract all the image features and run more than 8 iterations, but when I come back to it, it has stopped with "Unexpected Error". I ran it on a MacBook Air 2020 (Intel chip) a few times and then purchased a Mac mini M1 to try it on (256 GB storage, 8 GB RAM, 160 GB free space), but the same thing happens. Are there any logs with more details of the errors? "Unexpected Error" doesn't give me anything to go on. Thanks.
Posted by TKnapp. Last updated.
Post not yet marked as solved
1 Replies
385 Views
I'm looking for a way to deal with stock market data more easily than rewriting a time series data framework. I apparently need to preprocess much of the data I get from typical APIs (Finnhub.io, AlphaVantage.co) to remove the weekend days from the datasets. Problem: when using the awesome new Charts framework to plot daily close prices, I get weekends and holidays in my charts. No "real" stock charting tool does this; they somehow remove the non-market days from their charts. How? Researching, I found the Python pandas library for time series data. Can Apple's TabularData do this time series manipulation for me? Care to share an example? Thanks! David
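One approach that doesn't need pandas: filter the rows down to weekdays before charting, then plot by array index (labeling ticks with the dates) so the remaining gaps close up. A minimal sketch with made-up prices; a real chart would also need an explicit holiday list:

```python
# Sketch: dropping weekend rows from daily close-price data before
# charting. weekday() returns 0-4 for Mon-Fri; market holidays would
# still need an explicit exclusion list (assumed empty here).

from datetime import date

closes = [
    (date(2022, 11, 4), 100.0),   # Friday
    (date(2022, 11, 5), 100.0),   # Saturday
    (date(2022, 11, 6), 100.0),   # Sunday
    (date(2022, 11, 7), 101.5),   # Monday
]

holidays = set()                  # e.g. a set of known market-closed dates
trading_days = [(d, p) for d, p in closes
                if d.weekday() < 5 and d not in holidays]
print([d.isoformat() for d, p in trading_days])
# ['2022-11-04', '2022-11-07']
```

The same filter translates directly to Swift (Calendar's weekday component plus a Set of holiday dates) on the rows fed to the Charts framework.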
Posted. Last updated.
Post not yet marked as solved
2 Replies
677 Views
I encountered an error while experimenting with the new CreateMLComponents in a playground with the following code:

import CreateMLComponents
import CoreML

var fullyConnected = FullyConnectedNetworkRegressor<Float>.init()
fullyConnected.hiddenUnitCounts = [2]
let feature: AnnotatedFeature<MLShapedArray<Float>, Float> = .init(
    feature: .init(scalars: [2, 3], shape: [2]),
    annotation: 5
)
let fitted = try? await fullyConnected.fitted(to: [feature, feature])
print(fitted)

The generated error message is included (partially) at the end of this post. I later found out that this same code works fine in an actual app. Any insights?

The error message:

Playground execution failed: error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
The process has been left at the point where it was interrupted, use "thread return -x" to return to the state before expression evaluation.
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
  * frame #0: 0x00007ffb2093728d SwiftNN`SwiftNN.Tensor.scalar<τ_0_0 where τ_0_0: SwiftNN.TensorScalar>(as: τ_0_0.Type) -> τ_0_0 + 157
    frame #1: 0x00007ffb20937cbb SwiftNN`SwiftNN.Tensor.playgroundDescription.getter : Any + 91
    frame #2: 0x000000010da43ac4 PlaygroundLogger`___lldb_unnamed_symbol491 + 820
    frame #3: 0x000000010da45dbd PlaygroundLogger`___lldb_unnamed_symbol505 + 189
    (some more lines ...)
    PlaygroundLogger`___lldb_unnamed_symbol428 + 124
    frame #65: 0x000000010da41dad PlaygroundLogger`playground_log_hidden + 269
    frame #66: 0x000000010ca59aba $__lldb_expr14`async_MainTY1_ at CreateMLComp.xcplaygroundpage:12:5
    frame #67: 0x000000010ca59fb0 $__lldb_expr14`thunk for @escaping @convention(thin) @async () -> () at <compiler-generated>:0
    frame #68: 0x000000010ca5a0c0 $__lldb_expr14`partial apply for thunk for @escaping @convention(thin) @async () -> () at <compiler-generated>:0
Posted by Alan_Z. Last updated.