Engage with the WWDC20 community and connect with Apple engineers during the conference.

Posts under WWDC20 tag

26 Posts
Post marked as solved
3 Replies
4.6k Views
There is LazyVStack for working with a large amount of content inside a ScrollView while ensuring good scrolling performance. I like to work with List in my apps, as it provides more functionality than a plain ScrollView with a stack of content inside. Does List in SwiftUI now also lazily load its content in iOS 14? If it does not support this by default, is there any way to make it work with lazily loaded content? I tried this code, but it caused rendering issues:

```swift
List {
    LazyVStack {
        Text("1")
        // [...] hundreds of Text views for testing
    }
}
```
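For comparison, the lazy-loading pattern shown in the WWDC20 SwiftUI sessions is LazyVStack directly inside a ScrollView rather than inside a List — a minimal sketch:

```swift
import SwiftUI

// Rows inside a LazyVStack are created only as they scroll into view.
struct LazyContentView: View {
    var body: some View {
        ScrollView {
            LazyVStack {
                ForEach(1...1000, id: \.self) { index in
                    Text("\(index)")
                }
            }
        }
    }
}
```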
Posted
by
Post not yet marked as solved
4 Replies
1.7k Views
Hello, our use case is screen sharing in a live video call. We use a broadcast extension to capture screens and send frames. The broadcast extension has a hard limit of 50 MB. Screen sharing works great on iPhones, but on iPad, ReplayKit delivers larger screens, and as a result the extension's memory usage goes beyond 50 MB. While using the profiler we noticed that the memory used by our code is <25 MB, but on iPad ReplayKit has memory spikes that push usage beyond the 50 MB limit. How should I achieve the screen sharing use case on iPads? What is the guideline? Any suggestion/help is appreciated. Best, Piyush
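In case it helps others hitting the same limit: one mitigation is downscaling each frame inside the extension before encoding, so full-size iPad frames are never held for long. A rough sketch with CoreImage — buffer-pool reuse is omitted, and the 0.5 scale factor is an arbitrary assumption:

```swift
import CoreMedia
import CoreImage

// Shared context; creating one per frame would be wasteful.
let ciContext = CIContext(options: [.workingColorSpace: NSNull()])

/// Renders the sample buffer's image into a smaller pixel buffer.
func downscaled(_ sampleBuffer: CMSampleBuffer, scale: CGFloat = 0.5) -> CVPixelBuffer? {
    guard let srcBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let scaledImage = CIImage(cvPixelBuffer: srcBuffer)
        .transformed(by: CGAffineTransform(scaleX: scale, y: scale))

    var dstBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        Int(scaledImage.extent.width),
                        Int(scaledImage.extent.height),
                        kCVPixelFormatType_32BGRA,
                        nil,
                        &dstBuffer)
    guard let dst = dstBuffer else { return nil }
    ciContext.render(scaledImage, to: dst)  // draw the scaled frame into the smaller buffer
    return dst
}
```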
Posted
by
Post not yet marked as solved
11 Replies
4.2k Views
Hey, I have an app I've been experimenting with on iOS 13 that uses MultipeerConnectivity. After upgrading one of my devices to iOS 14 and using Xcode 12, this app no longer works. As outlined in "Support local network privacy in your app", I've added a description with NSLocalNetworkUsageDescription in Info.plist. Since the app uses Bonjour advertising and browsing via MCNearbyServiceAdvertiser and MCNearbyServiceBrowser, I've also added the service name via NSBonjourServices to Info.plist. It's my understanding that this should be enough to make browsing and advertising work. What I am observing is that when I call startAdvertisingPeer on my instance of MCNearbyServiceAdvertiser, I do not see the local network permission prompt; instead I see the following error in the console:

2020-06-27 13:26:54.634264+0100 MultiPeerTest[18970:1571288] [MCNearbyServiceAdvertiser] Server did not publish: errorDict [{
    NSNetServicesErrorCode = "-72000";
    NSNetServicesErrorDomain = 10;
}]

Further, my app is not showing up under Settings > Privacy > Local Network on my iPadOS 14 device. I looked at the TicTacToe example, which declares _tictactoe._tcp as its Bonjour service, but this seems to violate the Bonjour specification for service type as described in the documentation for MCNearbyServiceAdvertiser. My code is roughly:

```swift
localPeer = MCPeerID(displayName: deviceName)
advertiser = MCNearbyServiceAdvertiser(
    peer: localPeer,
    discoveryInfo: [:],
    serviceType: "my-service"
)
browser = MCNearbyServiceBrowser(
    peer: localPeer,
    serviceType: "my-service"
)
browser.startBrowsingForPeers()
advertiser.startAdvertisingPeer()
```

Environment: Xcode 12 Beta (12A6159), iPad Pro (10.5", 2017) running iPadOS 14.0 Beta
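If I'm reading the local network privacy documentation correctly, the NSBonjourServices entries need the full service-type form, and MultipeerConnectivity advertises over both TCP and UDP. So for a serviceType of "my-service", the Info.plist would presumably need something like the following — sketched here as an assumption:

```xml
<!-- Info.plist fragment; "_my-service" mirrors the serviceType in code -->
<key>NSLocalNetworkUsageDescription</key>
<string>This app uses the local network to find nearby peers.</string>
<key>NSBonjourServices</key>
<array>
    <string>_my-service._tcp</string>
    <string>_my-service._udp</string>
</array>
```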
Posted
by
Post not yet marked as solved
8 Replies
2.7k Views
I'm using AVPictureInPictureController.isPictureInPictureSupported() to detect whether the PiP feature is supported on the device. It works on iPadOS 13 and 14. As we know, iOS 14 supports PiP on iPhone, but when I use the same code there it returns false. I also tried AVPictureInPictureController(playerLayer: playerLayer)?.isPictureInPicturePossible, and it returns nil. I tested on iOS 14 beta 1 and beta 2 with an iPhone X simulator, and the result is the same. I also see that the PiP button is not shown for HTML5 video playback in Safari. How can I make it work on iOS 14, or how do I enable it?
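As far as I know, PiP also requires the audio session to be configured for playback and the "Audio, AirPlay, and Picture in Picture" background mode enabled in the target's capabilities; without those, the support check can return false. A minimal sketch under that assumption:

```swift
import AVKit

// Assumption: the target has the "Audio, AirPlay, and Picture in Picture"
// background mode enabled in Signing & Capabilities.
func setUpPictureInPicture(for playerLayer: AVPlayerLayer) {
    try? AVAudioSession.sharedInstance().setCategory(.playback)
    try? AVAudioSession.sharedInstance().setActive(true)

    guard AVPictureInPictureController.isPictureInPictureSupported(),
          let pipController = AVPictureInPictureController(playerLayer: playerLayer) else {
        print("PiP not available on this device")
        return
    }
    // Keep a strong reference to pipController somewhere long-lived,
    // and call pipController.startPictureInPicture() from a user action.
    _ = pipController
}
```

Note that some device/simulator combinations in the early betas may simply not support PiP, so testing on a physical iPhone is worth trying as well.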
Posted
by
Post not yet marked as solved
0 Replies
454 Views
Hello there, I am a developer with Apple. My request is about using the Apple Pay wallet as one of my favorite wallets for payments in my account. Please advise me. Thanks in advance.
Posted
by
Post not yet marked as solved
4 Replies
1.1k Views
Xcode version: 12.4 (used Xcode 12.5 to create the encryption key, as 12.4 is bugged and will not let you). I am bundling an ML model into my app and have encrypted the model via the Xcode 12 encryption method shown here. The model loads and works fine without encryption. After adding the model encryption key and setting it as a compiler flag, the model encryption/decryption works fine in a demo app, but using the same load method in my main app fails with this error message:

Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.CoreML Code=0 "Unable to load model at /path_to_model/my_model.mlmodelc/ with error: failed to invoke mremap_encrypted with result = -1, error = 12"

I have tried all variations of the load method available in my Swift-generated model file; all produce the same result as above.
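For what it's worth, the asynchronous load method on the generated class surfaces the underlying CoreML error instead of crashing on `try!`, which can make the failure easier to diagnose. A sketch, assuming the generated class is named `my_model` (taken from the path in the error message):

```swift
import CoreML

// Asynchronous load: encrypted models are decrypted on first load,
// so any decryption failure is delivered through the Result.
my_model.load(configuration: MLModelConfiguration()) { result in
    switch result {
    case .success(let model):
        print("Model loaded: \(model)")
    case .failure(let error):
        // error = 12 is ENOMEM; logging the full error helps when filing feedback
        print("Model failed to load: \(error)")
    }
}
```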
Posted
by
Post not yet marked as solved
1 Replies
495 Views
Watching https://developer.apple.com/wwdc20/10644, "Use Swift on AWS Lambda with Xcode". When I call http://127.0.0.1:7000/invoke, I get "Status: 405 Method Not Allowed" in the browser, and my breakpoint is not hit; there's no message in Xcode either. Did anyone get this to work?
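For what it's worth, the local invoke endpoint expects a POST with a JSON body, while opening the URL in a browser issues a GET — which would explain the 405. A sketch of invoking it from the command line; the payload shape must match your handler's Codable input type, and the `"name"` field here is just an assumption:

```shell
# POST a JSON event to the locally running Lambda runtime
# (started with LOCAL_LAMBDA_SERVER_ENABLED=true in the scheme).
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"name": "World"}' \
  http://127.0.0.1:7000/invoke
```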
Posted
by
MKX
Post not yet marked as solved
0 Replies
406 Views
Hi, I was watching https://developer.apple.com/videos/play/wwdc2020/10148/ and I read the demo project code, but I don't understand how the app knows that writing "W" in the text field corresponds to the "W" in the drawing area. I understand that we separate every character from the drawing data, but how is each character matched with the text field's text? This part confuses me.
Posted
by
Post not yet marked as solved
0 Replies
479 Views
Hi everybody, for my app, which is a dictionary, I need to create some ML models to analyze the text users type into my app's search bar, so that I can run a correct search for the requested word. I mainly need part-of-speech (POS) tagging and lemmatization models. Can you give me some examples of datasets?
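Depending on the languages involved, the built-in NaturalLanguage framework may already cover both needs without training custom models — a minimal sketch of POS tagging and lemmatization with NLTagger:

```swift
import NaturalLanguage

// Tag each word in the query with its lexical class (POS) and its lemma.
let query = "running dogs"
let tagger = NLTagger(tagSchemes: [.lexicalClass, .lemma])
tagger.string = query

tagger.enumerateTags(in: query.startIndex..<query.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitWhitespace, .omitPunctuation]) { tag, range in
    // Look up the lemma for the same word range.
    let lemma = tagger.tag(at: range.lowerBound, unit: .word, scheme: .lemma).0
    print("\(query[range]): POS = \(tag?.rawValue ?? "?"), lemma = \(lemma?.rawValue ?? "?")")
    return true  // continue enumerating
}
```

If the built-in tagger's language coverage is insufficient, a custom model can be trained with Create ML's word-tagging template, which takes labeled token/tag pairs as input.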
Posted
by
Post not yet marked as solved
1 Replies
341 Views
In the WWDC2020 "Keep your complications up to date" session, Mike shows that you have to create an App ID that ends with ".watchkitapp.complication" in order to support PushKit. Any time I try to add ".complication" to the end of my App ID, it gives an error: An App ID with Identifier 'com.x.y.watchkitapp.complication' is not available. Please enter a different string.
Posted
by
Post not yet marked as solved
1 Replies
462 Views
Hello, I downloaded the Pixar kitchen scene from Pixar and opened the rug asset as a USD file. It's about 26k polygons (unsubdivided mesh) and only 352 KB in size. Converting the rug to a SceneKit or OBJ file increases the file size to 3.7 MB, about 10x larger! How did Pixar manage to optimize/export a 26k-polygon model at 352 KB? Is this only possible using their proprietary Presto software? Are there special settings we need to use to export models created in Maya or 3ds Max with the same file size optimization? The rug model is 3.7 MB when exported as a USD file from 3ds Max.
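One possible explanation (offered as an assumption, not a certainty): the small file is likely stored in USD's compressed binary "crate" format (.usdc), whereas an export from a DCC tool may produce the larger ASCII form. Pixar's open-source USD tools can convert between the two — a sketch, assuming the USD toolset is installed and the file is named rug.usd (hypothetical name):

```shell
# Convert a USD file to the compressed binary crate format.
# usdcat ships with Pixar's open-source USD distribution.
usdcat rug.usd -o rug.usdc

# Compare the resulting file sizes.
ls -lh rug.usd rug.usdc
```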
Posted
by
Post not yet marked as solved
1 Replies
440 Views
Code from wwdc20-10652 is used.

1. Open PHPickerViewController
2. Choose 1 photo
3. Do nothing and dismiss

Leak is shown as below.

```swift
@IBAction private func chooseImagePressed(_ sender: Any) {
    if #available(iOS 14, *) {
        var configuration = PHPickerConfiguration()
        configuration.filter = .images
        let picker = PHPickerViewController(configuration: configuration)
        picker.delegate = self
        present(picker, animated: true)
    } else {
        // Fallback on earlier versions
    }
}

extension PhotosVC: PHPickerViewControllerDelegate {
    @available(iOS 14, *)
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        dismiss(animated: true)
    }
}
```
Posted
by
Post not yet marked as solved
0 Replies
305 Views
Hello Team, we would like to know the options for distributing a custom app via Apple Business Manager. We have seen your video https://developer.apple.com/videos/play/wwdc2020/10667 where the process is described using MDM. What if the customer does not have MDM? How can they get the custom app and then distribute it to their users? Thank you in advance,
Post not yet marked as solved
1 Replies
288 Views
I'm watching the "Broaden Your Reach with Siri Event Suggestions" video, and I need to register my domain with Apple, but I can't find the form. Thanks
Posted
by
Post not yet marked as solved
0 Replies
343 Views
Whenever a user tries to log in with biometrics and it fails multiple times, the passcode screen appears, and this is the screen we don't want for our highly secure applications. We are looking to authenticate the user with biometrics only, with no fallback to the passcode, the same as Apple's internal API https://developer.apple.com/documentation/security/secaccesscontrolcreateflags where we can require biometric-based login. Although we like the WebAuthn framework, this is the only reason we cannot move forward with it. Is this something Apple could consider providing to make applications more secure?
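For native flows outside WebAuthn, LocalAuthentication already offers a biometrics-only policy with no passcode fallback — a minimal sketch:

```swift
import LocalAuthentication

// .deviceOwnerAuthenticationWithBiometrics never falls back to the device
// passcode; repeated failures simply fail the evaluation.
let context = LAContext()
context.localizedFallbackTitle = ""  // hides the fallback button entirely

var error: NSError?
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Log in to your account") { success, evalError in
        if success {
            print("Authenticated with biometrics")
        } else {
            print("Biometric authentication failed: \(String(describing: evalError))")
        }
    }
}
```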
Posted
by
Post marked as solved
2 Replies
420 Views
I'm trying to implement an additional UICollectionViewCell state using UIConfigurationStateCustomKey. I've already read this post regarding practices for managing custom properties with cell configurations. It seems I can implement a custom state for a UICollectionView cell. Here is my implementation of the additional state:

```swift
protocol CustomTypesAccessable: AnyObject {
    var isMoving: Bool { get set }
}

extension UIConfigurationStateCustomKey {
    static let isMoving = UIConfigurationStateCustomKey("com.myApp.cell.isMoving")
}

extension UICellConfigurationState {
    var isMoving: Bool {
        get { self[.isMoving] as? Bool ?? false }
        set { self[.isMoving] = newValue }
    }
}

extension BaseBlockView: CustomTypesAccessable {
    var isMoving: Bool {
        get { currentConfiguration.currentConfigurationState?.isMoving ?? false }
        set { currentConfiguration.currentConfigurationState?.isMoving = newValue }
    }
}
```

My BaseBlockView class:

```swift
class BaseBlockView<Configuration: BlockConfigurationProtocol>: UIView, UIContentView {
    var configuration: UIContentConfiguration {
        get { currentConfiguration }
        set {
            guard let newConfiguration = newValue as? Configuration,
                  currentConfiguration != newConfiguration else { return }
            currentConfiguration = newConfiguration
        }
    }

    var currentConfiguration: Configuration {
        didSet {
            guard didSetupSubviews else { return }
            update(with: currentConfiguration)
        }
    }
    // ...
}
```

I want to update this state from my view controller the same way a collection view can make a cell selected:

```objc
- (void)selectItemAtIndexPath:(nullable NSIndexPath *)indexPath animated:(BOOL)animated scrollPosition:(UICollectionViewScrollPosition)scrollPosition;
```

So I'm trying to change the isMoving value this way:

```swift
let cell = cellForItem(at: indexPath)
let contentView = cell?.contentView as? CustomTypesAccessable
contentView?.isMoving = isMoving
```

Everything works fine in that case, but after the cell is reused, UICellConfigurationState's isMoving property changes back to false. Is this a UIKit bug?
Posted
by
Post not yet marked as solved
1 Replies
343 Views
We're trying to join our audio worker threads to a CoreAudio HAL audio workgroup, but haven't managed to get this working yet. Here's what we do:

Fetch the audio workgroup handle from the CoreAudio device:

```cpp
UInt32 Count = sizeof(os_workgroup_t);
os_workgroup_t pWorkgroup = NULL;
::AudioDeviceGetProperty(SomeCoreAudioDeviceHandle, kAudioUnitScope_Global, 0,
                         kAudioDevicePropertyIOThreadOSWorkgroup, &Count, &pWorkgroup);
```

This succeeds on an M1 Mini for the "Apple Inc.: Mac mini Speakers" on macOS 11.1. The returned handle looks fine as well: [(NSObject*)pWorkgroup debugDescription] returns "{xref = 2, ref = 1, name = AudioHALC Workgroup}".

Join some freshly created worker threads to the workgroup via:

```cpp
os_workgroup_join_token_s JoinToken;
int Result = ::os_workgroup_join(pWorkgroup, &JoinToken);
```

The problem: Result from os_workgroup_join is always EINVAL (Invalid argument), whatever we do. Both arguments, the workgroup handle and the join token, are definitely valid, and the device hasn't been stopped or reinitialized here, so the workgroup should not be cancelled. Has anyone else managed to get this working? All examples out there seem to successfully use the AUHAL workgroup instead of the audio device HAL API.
Posted
by