Downloading and Compiling a Model on the User's Device

Distribute Core ML models to the user's device after the app is installed.


Downloading and compiling models on the user's device, rather than bundling them with your app, is beneficial for specific use cases. If your app has a variety of features supported by different models, downloading the models individually in the background can save bandwidth and storage by not forcing the user to download every model. Likewise, different locales or regions might use different Core ML models, or models might be tuned offline for individual users, with updates provided over the air.

Implement Downloading and Compiling in the Background

The model definition file (.mlmodel) must be on the device before it's compiled. Use URLSession, CloudKit, or another networking toolkit to download the model for your app onto the user's device.
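As a sketch of the download step, here is one way to fetch the model definition with a URLSession download task. The `modelDownloadUrl` parameter and the completion handler shape are illustrative assumptions, not part of the Core ML API:

```swift
import Foundation

// Download the .mlmodel definition file.
// `modelDownloadUrl` is a hypothetical server URL for illustration.
func downloadModel(from modelDownloadUrl: URL,
                   completion: @escaping (Result<URL, Error>) -> Void) {
    let task = URLSession.shared.downloadTask(with: modelDownloadUrl) { tempUrl, _, error in
        if let error = error {
            completion(.failure(error))
            return
        }
        guard let tempUrl = tempUrl else { return }
        do {
            // Move the file out of the session's temporary location
            // before the system cleans it up.
            let destination = FileManager.default.temporaryDirectory
                .appendingPathComponent(modelDownloadUrl.lastPathComponent)
            try? FileManager.default.removeItem(at: destination)
            try FileManager.default.moveItem(at: tempUrl, to: destination)
            completion(.success(destination))
        } catch {
            completion(.failure(error))
        }
    }
    task.resume()
}
```

The resulting file URL can then be passed to the compile step described below.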

Call the MLModel class method compileModel(at:) to compile the downloaded model definition into the .mlmodelc format, then use the returned URL to initialize an MLModel instance. The compiled model has the same capabilities as a model bundled with the app.

Listing 1

Compiling a model file and creating an MLModel instance from the compiled version

let compiledUrl = try MLModel.compileModel(at: modelUrl)
let model = try MLModel(contentsOf: compiledUrl)
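Because compiling can take time, avoid blocking the main thread. One way to move the work onto a background queue, sketched here with DispatchQueue (the function name and completion handler are illustrative):

```swift
import CoreML
import Foundation

// Compile off the main thread, then deliver the model on the main queue.
func loadModel(at modelUrl: URL,
               completion: @escaping (Result<MLModel, Error>) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            let compiledUrl = try MLModel.compileModel(at: modelUrl)
            let model = try MLModel(contentsOf: compiledUrl)
            DispatchQueue.main.async { completion(.success(model)) }
        } catch {
            DispatchQueue.main.async { completion(.failure(error)) }
        }
    }
}
```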

Move Reusable Models to a Permanent Location

To limit the use of bandwidth, avoid repeating the download and compile processes when possible. The model is compiled to a temporary location. If the compiled model can be reused, move it to a permanent location, such as your app's support directory.

Listing 2

Copying the .mlmodelc file into your app's support directory

// find the app support directory
let fileManager = FileManager.default
let appSupportDirectory = try! fileManager.url(for: .applicationSupportDirectory,
        in: .userDomainMask, appropriateFor: compiledUrl, create: true)
// create a permanent URL in the app support directory
let permanentUrl = appSupportDirectory.appendingPathComponent(compiledUrl.lastPathComponent)
do {
    // If the file exists, replace it. Otherwise, copy the file to the destination.
    if fileManager.fileExists(atPath: permanentUrl.path) {
        _ = try fileManager.replaceItemAt(permanentUrl, withItemAt: compiledUrl)
    } else {
        try fileManager.copyItem(at: compiledUrl, to: permanentUrl)
    }
} catch {
    print("Error during copy: \(error.localizedDescription)")
}

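On subsequent launches, you can check the permanent location before downloading and compiling again. A sketch, assuming the same app support directory and file-naming scheme as Listing 2 (the function and `modelName` parameter are illustrative):

```swift
import CoreML
import Foundation

// Reuse a previously compiled model if it's already in the app support directory.
// `modelName` is an illustrative file name, such as the compiled model's
// last path component from an earlier launch.
func cachedModel(named modelName: String) throws -> MLModel? {
    let fileManager = FileManager.default
    let appSupportDirectory = try fileManager.url(for: .applicationSupportDirectory,
            in: .userDomainMask, appropriateFor: nil, create: true)
    let permanentUrl = appSupportDirectory.appendingPathComponent(modelName)
    // Fall back to nil so the caller knows to download and compile.
    guard fileManager.fileExists(atPath: permanentUrl.path) else { return nil }
    return try MLModel(contentsOf: permanentUrl)
}
```

If this returns nil, fall back to the download-and-compile path above.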
See Also

Machine Learning Model

class MLModel

An encapsulation of all the details of your machine learning model.

Making Predictions with a Sequence of Inputs

Integrate a recurrent neural network model to process sequences of inputs.