Hello,
I've been working to understand the new coremltools for converting machine learning models to .mlmodel files for use in my project, but am not finding much success. For reference, I've been using Yahoo's Open NSFW model, which provides a .caffemodel and a .prototxt file, suitable for conversion to .mlmodel. The conversion process worked without issue by running this command:
import coremltools

coreml_model = coremltools.converters.caffe.convert(('resnet_50_1by2_nsfw.caffemodel', 'deploy.prototxt'), image_input_names='data')
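I then save the converted model out to disk so it can be brought into Xcode (the output filename below is arbitrary, just whatever I want the .mlmodel to be called):

coreml_model.save('open_nsfw.mlmodel')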
Once I saved out my .mlmodel, I brought the file into my Xcode project and set up my iOS project like so:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(true)
    // Perform analysis as soon as the view is about to appear
    newDetect()
}

func newDetect() {
    do {
        // Wrap the generated Core ML class for use with Vision
        let model = try VNCoreMLModel(for: open_nsfw().model)
        let request = VNCoreMLRequest(model: model, completionHandler: handleResults)

        // Load the test image from the bundle and hand it to Vision
        let myImage = CIImage(image: UIImage(named: "testImage")!)
        let handler = VNImageRequestHandler(ciImage: myImage!)
        try handler.perform([request])
    } catch {
        print(error)
    }
}

func handleResults(request: VNRequest, error: Error?) {
    // Expecting the model's output back as feature-value observations
    guard let results = request.results as? [VNCoreMLFeatureValueObservation] else {
        fatalError("An error has occurred.")
    }
    print(results)
}
The goal here is to take my UIImage, which lives in the bundle, pass it through the model, and receive a "probability" score in handleResults (just printing the results would be suitable). Upon running the app, I never end up with any results; rather, I receive an empty dictionary in my console.
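To be explicit about what I'm expecting: once handleResults fires, I'd like to read the score out of the observation roughly like this (a sketch only; I'm assuming the converted model's output comes back as a two-element MLMultiArray where index 1 is the NSFW probability, matching the SFW/NSFW softmax of the original Caffe model):

// Inside handleResults, after the guard above (sketch; the output layout is an assumption)
if let observation = results.first,
   let scores = observation.featureValue.multiArrayValue {
    let nsfwProbability = scores[1].doubleValue   // index 1 = NSFW score (assumed)
    print("NSFW probability: \(nsfwProbability)")
}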
Have I done something wrong in this process?