Post not yet marked as solved
I'm looking into utilizing a cloud Linux machine to create and train my machine learning drawing classifier model with Turi Create. If I use my MacBook Pro it'll take at least 55 hours, so I want to offload that work and speed it up dramatically, allowing me to iterate on the model to get the best result.
Now there are a lot of options for RAM, CPU, and GPU capabilities. What will Turi Create take advantage of? For example, if I run it on a machine with 96 CPU cores, will it utilize them all and possibly speed it up to minutes rather than days, or would a better GPU be preferred? How much RAM would be good? I'll be training it to recognize 6500 classes with 20 images of each. Thanks!
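For context, the kind of training run I'm planning looks roughly like this - the file name, column names, and split are placeholders, and I haven't benchmarked any of it:

```python
# Sketch (untested): training a Turi Create drawing classifier on a cloud
# machine. "drawings.sframe" and the "label" column are hypothetical.
import turicreate as tc

# Ask Turi Create to use every available GPU (0 = CPU only, -1 = all GPUs).
tc.config.set_num_gpus(-1)

data = tc.SFrame("drawings.sframe")
train, test = data.random_split(0.8)

model = tc.drawing_classifier.create(train, target="label")
print(model.evaluate(test))

model.export_coreml("DrawingClassifier.mlmodel")
```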
I'm receiving this fatal exception in iOS 13 beta 1 when using my Core ML image classification model:

MTLDebugValidateMTLPixelFormat, line 1388: error 'pixelFormat (11) is not a valid MTLPixelFormat.'
MTLDebugValidateMTLPixelFormat:1388: failed assertion `pixelFormat (11) is not a valid MTLPixelFormat.'

guard let drawingImage = canvasView.currentDrawing.rasterized()?.cgImage,
    let model = try? VNCoreMLModel(for: Symbols().model)
    else { return }
let classificationRequest = VNCoreMLRequest(model: model) { [weak self] (request, error) in
    self?.processClassifications(for: request, error: error)
}
classificationRequest.imageCropAndScaleOption = .centerCrop
let handler = VNImageRequestHandler(cgImage: drawingImage)
try? handler.perform([classificationRequest]) // crashes here
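Assuming the assertion fires in the GPU evaluation path (my guess, unconfirmed), one thing I plan to try is forcing Vision to run the model on the CPU:

```swift
// Possible workaround sketch: usesCPUOnly forces Core ML to evaluate on the
// CPU - slower, but it may avoid the Metal pixel-format validation entirely.
let cpuOnlyRequest = VNCoreMLRequest(model: model) { [weak self] request, error in
    self?.processClassifications(for: request, error: error)
}
cpuOnlyRequest.imageCropAndScaleOption = .centerCrop
cpuOnlyRequest.usesCPUOnly = true // assumption: the crash is in the GPU path
```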
In iOS 13, users can select their preferred language for your app in Settings. It appears this only shows up if your app supports more than one language, and it only lists the languages your app supports, which makes sense.

I have a somewhat unique situation in my app. It is not localized for any language except English; however, the app allows users to work with dates formatted in their preferred language's format. This is currently implemented via:

let dateFormatter = DateFormatter()
dateFormatter.locale = Locale(identifier: Locale.preferredLanguages.first!)
dateFormatter.calendar = .current

Locale.preferredLanguages lists the user's preferred languages as specified in Settings, which satisfies most users. But some have requested I add a language selector in the app, because sometimes they don't want to use their most preferred locale. So it sounded like the iOS 13 app-specific language selection would be great for me! But it seems that's not the case. Any suggestions on what would be best to do in this scenario?
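One interim idea (just a sketch with a hypothetical UserDefaults key, not an Apple-sanctioned pattern) is to keep my own in-app override and fall back to Locale.preferredLanguages:

```swift
import Foundation

// Sketch: an in-app language override stored in UserDefaults.
// "SelectedLanguageOverride" is a hypothetical key for this example.
func dateFormatterForUserLanguage() -> DateFormatter {
    let override = UserDefaults.standard.string(forKey: "SelectedLanguageOverride")
    let identifier = override ?? Locale.preferredLanguages.first ?? "en"
    let formatter = DateFormatter()
    formatter.locale = Locale(identifier: identifier)
    formatter.calendar = .current
    formatter.dateStyle = .medium
    return formatter
}
```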
If you’re classifying images with Create ML and do not specify a maximum number of iterations, I understand it will iterate until it finds an optimal solution. But if it never does, how many iterations will it make, at most, before it stops? Does it stop when the accuracies stop improving? If they continue to improve even slightly, at what point will it decide there have been too many iterations and stop? I’m currently at 16, wondering if it could go to 25 or what I should expect.
In the Vision with Core ML session at WWDC the Vision FeaturePrint pretrained model was discussed. The presenter stated it is capable of predicting over 1000 categories. In my app I have 6000 categories to classify. Does this mean I cannot use Create ML to classify more than 1000 categories, or what will occur if I try to?
Labeling images with Unicode symbols results in corrupted class titles. For example, a label like “☀︎Unicode” may come back as "‚òÄÔ∏éUnicode".
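The garbage looks like UTF-8 bytes reinterpreted in a legacy single-byte encoding - that's my guess at the mechanism, not a confirmed diagnosis, but decoding the UTF-8 bytes of "☀︎Unicode" as Mac Roman reproduces the exact string I'm seeing:

```python
# Demonstration: UTF-8 bytes misread as Mac Roman produce the observed garbage.
label = "\u2600\ufe0eUnicode"                       # "☀︎Unicode"
mojibake = label.encode("utf-8").decode("mac_roman")
print(mojibake)                                     # → ‚òÄÔ∏éUnicode

# The corruption is lossless, so it can be reversed the same way:
restored = mojibake.encode("mac_roman").decode("utf-8")
print(restored == label)                            # → True
```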
In beta 1 I was able to create an image classifier model using a subset of my data. In beta 2 I’m now seeing an error:

MLCreateError: Detected inf/nan values in feature(s) '__image_feature(s)__', '__image_feature(s)__', (repeated many times) Cannot proceed with model training.

What is the issue here?
I have purchased a fancy new .app domain and now need to move my app's web content to this new domain. I have already successfully implemented Universal Links support, Handoff support, and Web Markup on the current domain. I have the apple-app-site-association file hosted at root and support deep links with the Smart App Banner. The app is live on the App Store and links up content in the app to web content via URL. All of this has been working well for a long while. Now I need to move to a new domain, and I want to ensure I do this correctly and avoid messing it all up.

What exactly do I need to do? I'm thinking:

1) Copy over all the web resources to the new domain - don't move them, to ensure links from the existing app version continue to work - and change all the URLs to the new domain, including the app-arguments for the smart banners
2) Copy over the apple-app-site-association file to the new domain and change the paths in the applinks details
3) Test links at the new domain using the App Search API Validation Tool
4) Update all the links in the app to the new domain - NSUserActivity webpageURLs and associated domains (applinks and activitycontinuation)
5) In iTunes Connect, specify the new marketing URL to ensure Applebot will index the new domain
6) Release the app update
7) At some point in the future, delete the web content and apple-app-site-association file at the old domain

Have I missed anything? Any tips or suggestions? Will it be okay to leave the web files in place at the old domain, or will that cause trouble? I wondered if I should configure redirects to point to the new domain, but I figured that may not work anyway - I believe I read it doesn't follow any redirects.

Thanks!
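For step 2, I expect the file at the new domain to keep the same shape as the existing one, with only the paths changing if the URL structure changes. A sketch (the team ID, bundle ID, and paths below are placeholders):

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.example.app",
        "paths": [ "/items/*", "/share/*" ]
      }
    ]
  }
}
```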
In an app that allows you to edit photos after viewing them in a grid view, such as the Example app using the Photos framework, there's a problem where the grid view shows an old version of a photo after it was edited.

This is because when you request the image right after it was edited [using requestImage(for:targetSize:contentMode:options:resultHandler:) after photoLibraryDidChange(_ changeInstance: PHChange)], this API returns a result image that does not contain the edit just made - it returns the previous version of that photo, before the edit was performed.

Any workarounds, or when could we expect a resolution? rdar://38123279
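For reference, the kind of re-request involved looks roughly like this - a sketch with placeholder names, where version = .current explicitly asks for the edited rendition:

```swift
import Photos
import UIKit

// Sketch: after photoLibraryDidChange, re-fetch the PHAsset by identifier and
// explicitly request the current (edited) version of the image.
func reloadThumbnail(for assetIdentifier: String, targetSize: CGSize,
                     completion: @escaping (UIImage?) -> Void) {
    guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [assetIdentifier],
                                          options: nil).firstObject else {
        completion(nil)
        return
    }
    let options = PHImageRequestOptions()
    options.version = .current          // ask for the edited rendition
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImage(for: asset, targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options) { image, _ in
        completion(image)
    }
}
```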
When needsInputModeSwitchKey is false, your custom keyboard is supposed to hide the 'switch keyboard' button. It is false on iPhone X because the globe button is provided by the system. However, it is also false when running an app in compatibility mode - not optimized for iPhone X - where the system falls back to the non-iPhone X keyboard inside the compatibility window. This results in there being no globe key visible, so it's impossible to switch keyboards.

I filed this bug report: rdar://35549038

Are there any workarounds? Any suggestions, other than showing the duplicate globe button on iPhone X until this issue is addressed? The issue continues to occur with iOS 11.2 beta 3. When can we expect this to be fixed?
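The stopgap I mentioned - always showing a duplicate globe key rather than trusting needsInputModeSwitchKey - would look roughly like this (class name and layout details are placeholders):

```swift
import UIKit

// Sketch: in a custom keyboard's UIInputViewController subclass, wire a
// globe button to the system's input-mode switcher and keep it visible
// unconditionally, since needsInputModeSwitchKey is unreliable here.
class KeyboardViewController: UIInputViewController {
    let globeButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        globeButton.setTitle("🌐", for: .normal)
        globeButton.addTarget(self,
                              action: #selector(handleInputModeList(from:with:)),
                              for: .allTouchEvents)
        // Always shown as a workaround; on iPhone X this duplicates the
        // system-provided globe key.
        view.addSubview(globeButton)
    }
}
```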
I have adopted Universal Links in my app. The App Search API Validation Tool is passing all validations, including Link to Application.

But when I open the web page in Safari on my Mac and open the App Switcher on my iPhone, it only shows Safari to hand off to, when it should have instead shown my app (like what happens with the Trello app and website). Similarly, when I'm in my app and I've made an NSUserActivity current with the webpage URL included, Safari does not come up as a Handoff suggestion on my MacBook.

Now, if I send the link to the webpage to myself and tap on it, it properly opens the app instead of Safari. But it doesn't provide the option to open it in Safari in the status bar at the top right - maybe that's a change in iOS 11, so that's fine.

Why isn't Handoff working properly here?
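For reference, my Handoff setup looks roughly like this - the activity type and URL below are placeholders:

```swift
import UIKit

// Sketch: publish an NSUserActivity that mirrors a web page, so Handoff can
// offer the page (or the app) on other devices.
let activity = NSUserActivity(activityType: "com.example.app.viewItem")
activity.webpageURL = URL(string: "https://example.com/items/42")
activity.isEligibleForHandoff = true
// Typically assigned to the current view controller, e.g.:
// viewController.userActivity = activity
activity.becomeCurrent()
```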