Posts

Post marked as unsolved
137 Views

How to detect the "Change Scene" action from Reality Composer in Xcode

I have a Reality Composer project that has two scenes. In the first scene, I post a notification from Xcode to trigger an animation and then change to the next scene. This is done using the Change Scene action in an Action Sequence. In the second scene, when the user taps on an object, it should send a notification to Xcode using the Notify action. How do I set up the .onAction handler in Xcode for the second scene if I switch to it using an Action Sequence from the first scene? Thanks!
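Here is a rough sketch of my setup. `Experience` is the class Xcode generates from the .rcproject file; `Scene1`, `Scene2`, and `tapObject` are placeholder names for my scenes and the Notify action identifier in the second scene:

```swift
import RealityKit

// Sketch of the setup described above. Names are placeholders:
// `Experience` is the auto-generated class from the .rcproject,
// `tapObject` is the Notify action's identifier in the second scene.
func setUpScenes(arView: ARView) throws {
    let scene1 = try Experience.loadScene1()
    let scene2 = try Experience.loadScene2()

    // This is the handler I don't know how to hook up correctly:
    // scene2 is shown via the Change Scene action, not anchored by me.
    scene2.actions.tapObject.onAction = { entity in
        print("Notify fired from scene 2: \(entity?.name ?? "?")")
    }

    arView.scene.anchors.append(scene1)
}
```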
Asked by SAIK1065.
Post marked as unsolved
209 Views

Style Transfer Model Not Training

Hi! I am attempting to create a style transfer model for images. My content image folder has about 80,000 images. However, when I click Train, the Create ML app displays the "Processing" message for about a minute before I get the error: Could not create buffer with format 'BGRA' (-6662). I hope someone can help me figure out what this issue is! Is my training data too large, or are there pictures in there that don't conform to the right image format? Thank you!
Asked by SAIK1065.
Post marked as solved
183 Views

How would one detect multiple devices with Nearby Interaction?

The sample code provided only allows a connection to one other device. I was wondering how one would create multiple sessions in Swift, how to handle different devices and their inputs, and how many sessions the U1 chip is capable of handling. Thank you!
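A sketch of one possible approach — one `NISession` per peer, stored in a dictionary — assuming discovery tokens are exchanged over MultipeerConnectivity as in Apple's sample code; the class and property names here are illustrative:

```swift
import NearbyInteraction
import MultipeerConnectivity

// Illustrative sketch: one NISession per peer, keyed by MCPeerID.
// Assumes each peer's NIDiscoveryToken arrives over MultipeerConnectivity.
final class MultiPeerDistanceManager: NSObject, NISessionDelegate {
    private var sessions: [MCPeerID: NISession] = [:]

    // Call once you have received a peer's discovery token.
    func startSession(with peer: MCPeerID, token: NIDiscoveryToken) {
        let session = NISession()
        session.delegate = self
        sessions[peer] = session
        session.run(NINearbyPeerConfiguration(peerToken: token))
    }

    // One delegate serves all sessions; the `session` parameter
    // tells the updates apart.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let peer = sessions.first(where: { $0.value === session })?.key else { return }
        for object in nearbyObjects {
            print("\(peer.displayName): \(object.distance.map(String.init) ?? "unknown") m")
        }
    }
}
```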
Asked by SAIK1065.
Post marked as unsolved
240 Views

Can On-Device Personalization Models be used with Cloud Deployment of ML Models?

I have a custom Core ML model built with Keras that has a couple of updatable layers. I was wondering: if I switched this model to being deployed from the cloud rather than packaged with the app, would the on-device personalized changes carry over whenever I deploy a new model over the cloud, or would a user have to start with a fresh model every time? Similarly, do on-device personalized models work with model encryption? Thanks!
Asked by SAIK1065.
Post marked as Apple Recommended
218 Views

How to navigate from Widgets

When I tap on a widget, it goes directly to my app. However, I am wondering how to deep link to various screens in my app from a widget. Is there some sort of NSUserActivity I can read?
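A minimal sketch of the usual approach, assuming a SwiftUI widget and a custom URL scheme (`myapp://` here is a placeholder for your own): tag the widget view with `.widgetURL(_:)`, then handle the URL in the app with `.onOpenURL`.

```swift
import SwiftUI
import WidgetKit

// In the widget: attach a deep-link URL to the view.
// "myapp://item/42" is a placeholder for your own URL scheme and route.
struct MyWidgetEntryView: View {
    var body: some View {
        Text("Item 42")
            .widgetURL(URL(string: "myapp://item/42"))
    }
}

// In the app: route based on the incoming URL. SwiftUI-lifecycle apps
// use onOpenURL; UIKit apps receive it in scene(_:openURLContexts:).
struct MyAppRootView: View {
    var body: some View {
        Text("Home")
            .onOpenURL { url in
                // Inspect url.host / url.pathComponents and navigate.
                print("Opened from widget with \(url)")
            }
    }
}
```

For medium and large widgets, individual `Link` views inside the widget can each carry a different URL; `.widgetURL(_:)` applies one URL to the whole widget.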
Asked by SAIK1065.
Post marked as unsolved
952 Views

Adding too many subviews in iPad Swift Playgrounds gives me the error "There was a problem encountered while running this playground. Check your code for mistakes"

Hi! So I'm creating my playground on iPad. It's just a simple playground, not a playground book, so the file `Contents.swift` has multiple classes. However, in each ViewController class, I can add no more than 7 UIViews (including UIImageViews, UIButtons, UILabels, etc.) before my playground gives me an error: "There was a problem encountered while running this playground. Check your code for mistakes". This is really vague, and there is nothing wrong with my code. If I remove the line `self.view.addSubview(someView)`, then everything works as expected. How can I work around this?
Asked by SAIK1065.
Post marked as unsolved
895 Views

Emojis in ARKit?

Hi guys! So I have a question about the usage of emojis in our playgrounds. I know that we are allowed to use them as text but not as images. However, emojis don't show up as SCNText. I have a function (below) that renders any string into an image, and setting the material of a node to that image works. However, I'm confused as to whether this is allowed or not, since it uses the emoji as an image. If this isn't allowed, how can I add an emoji as text to my ARSCN scene?

```swift
extension String {
    // Renders the string (e.g. an emoji) into a small transparent image.
    func emojiToImage() -> UIImage? {
        let size = CGSize(width: 30, height: 35)
        UIGraphicsBeginImageContextWithOptions(size, false, 0)
        UIColor.clear.set()
        let rect = CGRect(origin: .zero, size: size)
        UIRectFill(rect)
        (self as NSString).draw(in: rect, withAttributes: [NSAttributedStringKey.font: UIFont.systemFont(ofSize: 30)])
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
```
Asked by SAIK1065.
Post marked as solved
348 Views

How to add an image to an ARSCNScene?

Hi all! Hope you're making awesome strides in your playgrounds. So I stumbled upon a roadblock with the development of my playground. I want to add an image to my ARSCNScene. Not as a background, but as an actual node with position and all. Does anyone know whether this is possible? I couldn't find anything online. Thanks!
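For reference, one way this is commonly done is to put the image on an `SCNPlane` as a texture and add the resulting node to the scene; the size and position below are placeholders:

```swift
import SceneKit

// Sketch: show a UIImage in the scene as a textured plane node.
// Width and height are in scene units (meters in ARKit).
func makeImageNode(image: UIImage, width: CGFloat = 0.2, height: CGFloat = 0.2) -> SCNNode {
    let plane = SCNPlane(width: width, height: height)
    plane.firstMaterial?.diffuse.contents = image   // the image becomes the texture
    plane.firstMaterial?.isDoubleSided = true       // visible from behind too
    let node = SCNNode(geometry: plane)
    node.position = SCNVector3(0, 0, -0.5)          // half a meter in front of the origin
    return node
}

// Usage: sceneView.scene.rootNode.addChildNode(makeImageNode(image: myImage))
```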
Asked by SAIK1065.
Post marked as solved
3.0k Views

How to add CoreML on iOS Swift Playgrounds?

Hello! I am trying to include Core ML in my playground, with use of the camera. However, it seems that I cannot access my Core ML model from the Vision framework in iOS Swift Playgrounds. Would I have to use a Playground Book for this?
Asked by SAIK1065.
Post marked as unsolved
222 Views

How to convert the following data model into a Core ML model? Please help!

So I have been following the SMS Spam Detection tutorial at this link: http://radimrehurek.com/data_science_python/. I get the data model and everything works perfectly. What I am not understanding is how to convert this model into a Core ML model. The documentation does not help, as it uses a data model already built into the library. I would appreciate it if someone could follow this tutorial and tell me how I can convert the .pkl file into a Core ML model. Please post whatever code you've got below. Thanks!
Asked by SAIK1065.
Post marked as unsolved
1.5k Views

Does Core ML Support scikit-learn?

Hey everyone! So I have created a classifier using scikit-learn. However, the data model I get is a .pkl (pickle) file. I am wondering if Core ML supports the conversion of .pkl files or only .csv files. Also, if Core ML does not support .pkl files, how can I convert my file into a .csv file? Thanks!

EDIT: Basically, I would like to know whether I can convert a .pkl file to a .mlmodel, and how I can do so.
Asked by SAIK1065.
Post marked as solved
334 Views

Only part of the keyboard shown

When I was testing my playground in Xcode 8.2.1, the keyboard didn't show, so I would use the Mac keyboard. After updating to 8.3 and following @pdm's advice on how to get interaction working, only part of the keyboard shows. How do I hide the software keyboard in code so that only the Mac hardware keyboard can be used? I still need keyboard input for the UITextField.
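One common workaround (a sketch, not verified on every Xcode version) is to replace the text field's `inputView` with an empty view, which suppresses the software keyboard while hardware key input still reaches the field:

```swift
import UIKit

// Suppress the on-screen keyboard for this field. A hardware (Mac)
// keyboard still delivers text because the field remains first responder.
let textField = UITextField(frame: CGRect(x: 0, y: 0, width: 200, height: 30))
textField.inputView = UIView(frame: .zero)
```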
Asked by SAIK1065.