Posts

Post marked as solved
1 Replies
0 Views
!! SOLVED !! It turns out that preparing an application for speech recognition using live audio from the microphone is a two-step process.

The first step is well documented and involves adding a key to the app's Info.plist. The following screenshot shows the entry that is needed in the app's project file. To add a key via the UI, mouse over any key and click the plus icon that appears to its right, then choose the needed key from the drop-down list. After choosing the key, you will need to add some descriptive text; it will be shown in the permissions request dialog when the app is first launched on the computer/device.

The second step is not well documented; I stumbled upon it accidentally while researching a different matter. macOS apps run in a sandbox and need to declare which capabilities they want to use. The following screenshot is from the Signing & Capabilities tab of the app's project file. Look for the section labelled 'App Sandbox'. If it isn't there, you may need to add it via the +Capability button just below the tabs. Find Audio Input in the 'Hardware' group and make sure its checkbox is checked.
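For reference, both settings can also be made by editing the raw files instead of the UI. A minimal sketch, assuming the standard Apple key names (your description strings will obviously differ):

```xml
<!-- Info.plist: usage descriptions shown in the permission prompts -->
<key>NSSpeechRecognitionUsageDescription</key>
<string>This app transcribes what you say into text.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app listens to the microphone for live transcription.</string>

<!-- YourApp.entitlements: the App Sandbox "Audio Input" capability -->
<key>com.apple.security.device.audio-input</key>
<true/>
```

Checking the Audio Input box in Signing & Capabilities simply writes that last entitlement into the .entitlements file for you.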
Post not yet marked as solved
1 Replies
0 Views
Generally speaking, your shapeNode conversion should work. It's possible that you are experiencing either a layering problem (triangles are behind the main background color - check the zPosition on your spriteNodes), or the conversion of the shapeNode to a texture isn't working as you would expect. I ran into similar problems with shapeNodes in my game. One thing you could do to troubleshoot is to use Playgrounds to build and convert a shapeNode to a texture, then add it to a scene in the Playground. If that doesn't work (just like in your app), then you have a nice laboratory where you can try different things to figure out what is causing the problem. If I recall correctly, one of the problems that I ran into had to do with positioning of a shapeNode and/or its converted texture. There might be other valid reasons why your triangles aren't showing, but the Playground approach should help you explore various possibilities.
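As a concrete starting point for the Playground experiment, the conversion itself boils down to SKView.texture(from:). A minimal sketch; the triangle path, sizes, and zPosition value here are mine, not from the original poster's app:

```swift
import SpriteKit

// Build a simple triangle shape node (triangleSize is an assumed value).
let triangleSize: CGFloat = 64
let path = CGMutablePath()
path.move(to: CGPoint(x: 0, y: 0))
path.addLine(to: CGPoint(x: triangleSize, y: 0))
path.addLine(to: CGPoint(x: triangleSize / 2, y: triangleSize))
path.closeSubpath()

let shapeNode = SKShapeNode(path: path)
shapeNode.fillColor = .red

// Render the shape node into a texture via an SKView, then wrap it in a sprite.
let renderView = SKView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
if let texture = renderView.texture(from: shapeNode) {
    let sprite = SKSpriteNode(texture: texture)
    sprite.zPosition = 10   // keep it in front of the scene's background
    // scene.addChild(sprite) // position it where you expect it to appear
}
```

If the sprite still doesn't show, printing texture.size() and the sprite's position and zPosition is a quick way to tell a layering problem from a conversion problem.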
Post not yet marked as solved
3 Replies
0 Views
Even if you could prevent the device from taking a screenshot, how would you stop someone from using a regular camera, or another mobile device with a camera, to photograph what is showing in your app? Perhaps some insight into why you need to prevent users from taking screenshots would help generate ideas for how to address it.
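For what it's worth, iOS doesn't offer a way to block screenshots, but it does let an app detect them after the fact via UIApplication.userDidTakeScreenshotNotification. A minimal sketch:

```swift
import UIKit

class SecureViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Fires only after the user has already taken the screenshot;
        // you can log it or react in the UI, but you cannot prevent it.
        NotificationCenter.default.addObserver(
            forName: UIApplication.userDidTakeScreenshotNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("User took a screenshot")
        }
    }
}
```

Depending on why screenshots are a concern, detection plus a policy response may be the closest you can get.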
Post not yet marked as solved
8 Replies
0 Views
Confirmed. After updating to macOS 10.15.1 and upgrading Xcode to 11.2, my physics body is generated correctly.
Post not yet marked as solved
8 Replies
0 Views
I filed a bug (about a week ago) with example code to demonstrate the problem. Does anyone know how long it takes for Apple to respond to bugs?

NOTE: I used the Feedback Assistant, as I don't have a paid developer account, which is needed to file a bug via ITS.
Post marked as solved
2 Replies
0 Views
Do you only see these problems when running in Xcode? Do the same problems exist if you execute the built app from the product build directory?
Post not yet marked as solved
8 Replies
0 Views
I'm having the same issue. It appears to be linked to updating to Catalina (on macOS) and iOS 13, specifically when using a texture to build the physics body. When a shape is used (e.g. a circle with radius X), the physics body is created.

Does anyone know if a bug has been filed for this yet?
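To make the distinction concrete, here is a minimal sketch of the two initializers being compared; the "ship" asset name is a placeholder, not from the original post:

```swift
import SpriteKit

let sprite = SKSpriteNode(imageNamed: "ship") // placeholder asset name

// Shape-based body (a circle with radius X): this path still works.
sprite.physicsBody = SKPhysicsBody(circleOfRadius: sprite.size.width / 2)

// Texture-based body: the path that stopped generating a body
// after the Catalina / iOS 13 update described above.
if let texture = sprite.texture {
    sprite.physicsBody = SKPhysicsBody(texture: texture, size: sprite.size)
}
```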
Post marked as solved
3 Replies
0 Views
Thanks for the links. They were helpful, but not complete.

I've got an AVRoutePickerView on my screen, and I added an observer to the NotificationCenter (as directed in the documentation). However, when I choose the Apple TV from the route picker, I get the following error message:

MPMediaControlsRemoteViewController Dismissing because view service terminated

I was unable to find anything in a Google search regarding a cause/solution for that message. However, if I first mirror my iPhone to the Apple TV before launching my app, the observer method does get fired, and I can see the 2nd screen. That's definitely further than I had gotten before, but I'd like to know how to get the observer to work properly with the AVRoutePickerView alone. Here are the pertinent parts of my code (a SpriteKit game), in case they're needed for reference:

GameViewController:

override func viewDidLoad() {
    super.viewDidLoad()

    let notifier = NotificationCenter.default
    notifier.addObserver(self,
                         selector: #selector(GameViewController.screenDidConnect),
                         name: NSNotification.Name.UIScreenDidConnect,
                         object: nil)
    notifier.addObserver(self,
                         selector: #selector(GameViewController.screenDidDisconnect),
                         name: NSNotification.Name.UIScreenDidDisconnect,
                         object: nil)

    let skView = self.view as! SKView
    let scene = TitleScene(size: skView.bounds.size)
    scene.backgroundColor = SKColor.black
    skView.presentScene(scene)
}

@objc fileprivate func screenDidConnect(notification: NSNotification) {
    print("GameViewController: Screen was connected: \n\(notification)")
    print("# of Screens: \(UIScreen.screens.count)")
}

@objc fileprivate func screenDidDisconnect(notification: NSNotification) {
    print("GameViewController: Screen was disconnected: \n\(notification)")
    print("# of Screens: \(UIScreen.screens.count)")
}

Any ideas as to what I'm missing?
Post not yet marked as solved
6 Replies
0 Views
I wasn't able to find anything native in SpriteKit, either. I did find code written by Craig Grummitt that works really well. Hopefully it will work for you, too.
Post marked as solved
2 Replies
0 Views
I finally got the code to work. Posting here in case anyone else comes upon this and needs a similar solution.

I changed 2 items. 1) When comparing screen widths to the initial view, I used 'bounds' instead of 'frame'. 2) I turned off the 'Hello World' label fade-in animation (the text stayed transparent for some reason).

func checkForExternalDisplay() {
    let screens = UIScreen.screens
    if screens.count > 1 {
        let mainScreenIndex = self.view.bounds.width == screens[0].bounds.width ? 0 : 1
        let otherScreenIndex = self.view.bounds.width == screens[0].bounds.width ? 1 : 0
        let primaryScreenIndex = screens[0].bounds.width >= screens[1].bounds.width ? 0 : 1

        let otherWindow = UIWindow(frame: screens[otherScreenIndex].bounds)
        otherWindow.rootViewController = UIViewController()
        otherWindow.screen = screens[otherScreenIndex]

        let windowRectangle = CGRect(x: 0, y: 0,
                                     width: screens[otherScreenIndex].bounds.width,
                                     height: screens[otherScreenIndex].bounds.height)
        let otherView = SKView(frame: windowRectangle)
        otherWindow.addSubview(otherView)
        otherWindow.makeKeyAndVisible()
        otherWindow.isHidden = false

        if primaryScreenIndex == mainScreenIndex {
            primaryView = self.view as? SKView
        } else {
            primaryView = otherView
        }
    } else {
        primaryView = self.view as? SKView
    }
    primaryView?.backgroundColor = SKColor.green
}

(code from GameScene.swift)

override func didMove(to view: SKView) {
    // Get label node from scene and store it for use later
    self.label = self.childNode(withName: "//helloLabel") as? SKLabelNode
    // if let label = self.label {
    //     label.alpha = 0.0
    //     label.run(SKAction.fadeIn(withDuration: 2.0))
    // }

    // Create shape node to use during mouse interaction
    let w = (self.size.width + self.size.height) * 0.05
    self.spinnyNode = SKShapeNode(rectOf: CGSize(width: w, height: w), cornerRadius: w * 0.3)

    if let spinnyNode = self.spinnyNode {
        spinnyNode.lineWidth = 2.5
        spinnyNode.run(SKAction.repeatForever(SKAction.rotate(byAngle: CGFloat(Double.pi), duration: 1)))
        spinnyNode.run(SKAction.sequence([SKAction.wait(forDuration: 0.5),
                                          SKAction.fadeOut(withDuration: 0.5),
                                          SKAction.removeFromParent()]))
    }
}