Post not yet marked as solved
Hi, I don't understand how to implement PKIssuerProvisioningExtensionHandler in my code. I want to provision a card in Wallet and confirm the provisioning in my app.
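For orientation, a minimal skeleton of the extension's principal class might look like the following. This is a sketch only: the status flags and the empty entry list are placeholders, not a working issuer flow, and the exact shapes should be checked against the PassKit documentation.

```swift
import PassKit

// Sketch of the non-UI issuer provisioning extension (iOS 14+).
class ProvisioningHandler: PKIssuerProvisioningExtensionHandler {

    // Report whether cards are available to add and whether the user
    // must authorize through the extension's UI.
    override func status(completion: @escaping (PKIssuerProvisioningExtensionStatus) -> Void) {
        let status = PKIssuerProvisioningExtensionStatus()
        status.requiresAuthentication = true   // placeholder values
        status.passEntriesAvailable = true
        completion(status)
    }

    // Return the entries (cards) that can be provisioned on this device.
    override func passEntries(completion: @escaping ([PKIssuerProvisioningExtensionPassEntry]) -> Void) {
        // Build PKIssuerProvisioningExtensionPaymentPassEntry values here.
        completion([])
    }
}
```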
After restarting the Apple TV, while the Siri Remote is not yet connected, I opened a UISearchController using the IR remote. The UISearchController displays both the regular horizontal keyboard and the grid keyboard, and the two keyboards overlap.
Please note that once the Siri Remote is connected to the Apple TV, this issue is resolved.
#WWDC20-10634
I'm using a UISplitViewController to display a contact list and a contact details screen. In the primary column I have a UITableViewController.
In didSelectRowAt I call showDetailViewController (via a segue or programmatically through the delegate), but it always pushes the view controller. My question: why does it push instead of replacing the secondary column?
Most solutions I found hide the back button and keep pushing, which keeps growing the navigation stack.
Printing
self.viewController(for: .secondary)?.navigationController?.viewControllers.count
in the segue/didSelectRowAt handler shows that the count keeps increasing.
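One way to replace the secondary column's content instead of pushing onto it, sketched against the iOS 14 column-style API (ContactDetailViewController and contacts are hypothetical names):

```swift
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
    // `ContactDetailViewController` and `contacts` are placeholder names.
    let detail = ContactDetailViewController(contact: contacts[indexPath.row])

    if let nav = splitViewController?.viewController(for: .secondary)?.navigationController {
        // Reset the secondary column's navigation stack rather than pushing.
        nav.setViewControllers([detail], animated: false)
    } else {
        splitViewController?.setViewController(detail, for: .secondary)
    }
    splitViewController?.show(.secondary)
}
```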
In my app, the inactivity timeout is set to 10 minutes; after 10 minutes of idle time, the user is logged out. We recently added a chat page to the app. Now, when the app goes into the background and the timeout period elapses while it is there, we need to call a server API to end the active chat session on the server side. In short, the requirements are:
1) Keep track of idle time in the foreground and also after the app moves into the background. (Should we use background processing modes to keep track of time in the background as well?)
2) Whenever the idle timeout (10 minutes) is exceeded while the app is in the background, make an API call to end the active chat session. (This already works in the foreground.)
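One possible approach to requirement 2 is sketched below using BackgroundTasks. Caveats: the task identifier and the server call are assumptions, and BGAppRefreshTask execution is at the system's discretion, so the 10-minute deadline is not guaranteed; a server-side session timeout is the only fully reliable backstop.

```swift
import BackgroundTasks
import UIKit

enum SessionTimeout {
    static let taskID = "com.example.endChatSession"  // placeholder identifier
    static var lastActivity = Date()

    // Call once at launch, before the app finishes launching.
    static func register() {
        BGTaskScheduler.shared.register(forTaskWithIdentifier: taskID, using: nil) { task in
            handle(task: task as! BGAppRefreshTask)
        }
    }

    // Call from applicationDidEnterBackground / sceneDidEnterBackground.
    static func schedule() {
        let request = BGAppRefreshTaskRequest(identifier: taskID)
        request.earliestBeginDate = Date(timeIntervalSinceNow: 10 * 60)
        try? BGTaskScheduler.shared.submit(request)
    }

    static func handle(task: BGAppRefreshTask) {
        if Date().timeIntervalSince(lastActivity) >= 10 * 60 {
            // endChatSessionOnServer() -- hypothetical API call
        }
        task.setTaskCompleted(success: true)
    }
}
```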
Inside
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult])
I am trying to obtain a UIImage from a Live Photo. I have parsed the results and separated regular still photos from live photos. For the still photos I already have the UIImage, but for the live photos I do not understand how to get from a PHLivePhoto to the underlying PHAsset.
if let livePhoto = object as? PHLivePhoto {
    DispatchQueue.main.async {
        // What code do I insert here to extract a UIImage from a PHLivePhoto?
    }
}
Thanks for the help in advance!
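One possible approach, assuming the same NSItemProvider that vends the PHLivePhoto can also vend a UIImage representation (worth verifying for your results), is to skip PHLivePhoto entirely and ask the provider for the still image:

```swift
import PhotosUI
import UIKit

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    for result in results {
        let provider = result.itemProvider
        // For a live-photo result, ask the provider for the still image directly.
        if provider.canLoadObject(ofClass: PHLivePhoto.self),
           provider.canLoadObject(ofClass: UIImage.self) {
            provider.loadObject(ofClass: UIImage.self) { object, error in
                guard let image = object as? UIImage else { return }
                DispatchQueue.main.async {
                    // use `image` here
                }
            }
        }
    }
}
```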
I know that APNs is a "best effort" service, but could somebody tell me whether Local Push Connectivity for restricted networks is more reliable?
I get a local network privacy alert even when the device is on cellular data. Why? How can I locate the cause?
We're trying to join our audio worker threads to a CoreAudio HAL audio workgroup, but haven't managed to get this working yet.
Here's what we do:
Fetch audio workgroup handle from the CoreAudio device:
UInt32 Count = sizeof(os_workgroup_t);
os_workgroup_t pWorkgroup = NULL;
::AudioDeviceGetProperty(SomeCoreAudioDeviceHandle, kAudioUnitScope_Global, 0,
                         kAudioDevicePropertyIOThreadOSWorkgroup, &Count, &pWorkgroup);
This succeeds on an M1 Mac mini for the "Apple Inc.: Mac mini Speakers" device on macOS 11.1.
The returned handle looks fine as well:
[(NSObject*)pWorkgroup debugDescription] returns
"{xref = 2, ref = 1, name = AudioHALC Workgroup}"
Join some freshly created worker threads to the workgroup via:
os_workgroup_join_token_s JoinToken;
int Result = ::os_workgroup_join(pWorkgroup, &JoinToken);
The problem:
The result from os_workgroup_join is always EINVAL (Invalid argument), whatever we do. Both arguments, the workgroup handle and the join token, are definitely valid, and the device hasn't been stopped or reinitialized at this point, so the workgroup should not be cancelled.
Has anyone else managed to get this working? All examples out there seem to successfully use the AUHAL workgroup instead of the audio device HAL API.
I'm trying to implement an additional UICollectionViewCell state using UIConfigurationStateCustomKey.
I've already read this post regarding practices for managing custom properties with cell configurations.
It seems I can implement a custom state for a UICollectionView cell.
Here is my implementation of the additional state:
protocol CustomTypesAccessable: AnyObject {
    var isMoving: Bool { get set }
}

extension UIConfigurationStateCustomKey {
    static let isMoving = UIConfigurationStateCustomKey("com.myApp.cell.isMoving")
}

extension UICellConfigurationState {
    var isMoving: Bool {
        get { self[.isMoving] as? Bool ?? false }
        set { self[.isMoving] = newValue }
    }
}

extension BaseBlockView: CustomTypesAccessable {
    var isMoving: Bool {
        get {
            currentConfiguration.currentConfigurationState?.isMoving ?? false
        }
        set {
            currentConfiguration.currentConfigurationState?.isMoving = newValue
        }
    }
}
My BaseBlockView class:
class BaseBlockView<Configuration: BlockConfigurationProtocol>: UIView, UIContentView {
    var configuration: UIContentConfiguration {
        get { currentConfiguration }
        set {
            guard let newConfiguration = newValue as? Configuration,
                  currentConfiguration != newConfiguration else { return }
            currentConfiguration = newConfiguration
        }
    }

    var currentConfiguration: Configuration {
        didSet {
            guard didSetupSubviews else { return }
            update(with: currentConfiguration)
        }
    }
    // init, didSetupSubviews, and update(with:) omitted
}
I want to update this state from my view controller, the same way a collection view can make a cell selected:
- (void)selectItemAtIndexPath:(nullable NSIndexPath *)indexPath animated:(BOOL)animated scrollPosition:(UICollectionViewScrollPosition)scrollPosition;
So I'm trying to change the isMoving value this way:
let cell = cellForItem(at: indexPath)
let contentView = cell?.contentView as? CustomTypesAccessable
contentView?.isMoving = isMoving
Everything works fine in that case, but after the cell is reused, UICellConfigurationState's isMoving property changes back to false.
Is this a UIKit bug?
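For comparison, the pattern Apple demonstrates for custom configuration states stores the flag on the cell itself and merges it into configurationState, so it survives configuration updates and reuse. A sketch (MyCell is a hypothetical cell subclass, reusing the isMoving key defined above):

```swift
import UIKit

class MyCell: UICollectionViewCell {
    // The custom state lives on the cell; changing it requests an update pass.
    var isMoving: Bool = false {
        didSet { setNeedsUpdateConfiguration() }
    }

    // Merge the custom flag into the system-provided state.
    override var configurationState: UICellConfigurationState {
        var state = super.configurationState
        state.isMoving = isMoving
        return state
    }

    override func updateConfiguration(using state: UICellConfigurationState) {
        // Read state.isMoving here and update the content configuration.
    }
}
```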
Whenever the user tries to log in with biometrics and it fails multiple times, the passcode screen appears, and this is the screen we don't want in our security-sensitive application. We want to authenticate the user with biometrics only, without the passcode fallback, similar to what the SecAccessControlCreateFlags API https://developer.apple.com/documentation/security/secaccesscontrolcreateflags allows, where we can require biometric-based login.
We like the WebAuthn framework, but this is the only reason we cannot move forward with it. Is this something Apple can consider providing to make applications more secure?
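For reference, the biometrics-only behavior described above (no passcode fallback) can be expressed with LocalAuthentication; a sketch:

```swift
import LocalAuthentication

func authenticateBiometricsOnly(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    // An empty fallback title removes the "Enter Password" button entirely.
    context.localizedFallbackTitle = ""

    var error: NSError?
    // .deviceOwnerAuthenticationWithBiometrics never falls back to the passcode.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Log in to your account") { success, _ in
        completion(success)
    }
}
```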
I'm watching the "Broaden your reach with Siri Event Suggestions" video, and I need to register my domain with Apple, but I can't find the form. Thanks.
Hello Team,
We would like to know the options for distributing a custom app via Apple Business Manager.
We have seen your video https://developer.apple.com/videos/play/wwdc2020/10667 where the process using MDM is described.
What if the customer does not have an MDM? How can they get the custom app and then distribute it to their users?
Thank you in advance,
Code from wwdc20-10652 is used. Steps to reproduce:
1) Open PHPickerViewController
2) Choose 1 photo
3) Do nothing and dismiss
The leak is shown below.
@IBAction private func chooseImagePressed(_ sender: Any) {
    if #available(iOS 14, *) {
        var configuration = PHPickerConfiguration()
        configuration.filter = .images
        let picker = PHPickerViewController(configuration: configuration)
        picker.delegate = self
        present(picker, animated: true)
    } else {
        // Fallback on earlier versions
    }
}

extension PhotosVC: PHPickerViewControllerDelegate {
    @available(iOS 14, *)
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        dismiss(animated: true)
    }
}
Hello,
I downloaded the Pixar kitchen scene and opened the rug asset as a USD file. It's about 26k polygons (unsubdivided mesh) and only 352 KB in size.
Converting the rug to a SceneKit or OBJ file increases the file size to 3.7 MB, about 10x larger!
How did Pixar manage to optimize/export a 26k-polygon model at 352 KB? Is this only possible using their proprietary Presto software?
Are there special settings we need to use to export models created in Maya or 3ds Max with the same file-size optimization? The rug model is 3.7 MB when exported as a USD file from 3ds Max.
In the WWDC2020 "Keep your complications up to date" session, Mike shows that you have to create an App ID that ends with ".watchkitapp.complication" in order to support PushKit. Any time I try to add ".complication" to the end of my App ID, I get this error:
An App ID with Identifier 'com.x.y.watchkitapp.complication' is not available. Please enter a different string.
Hi everybody. For my dictionary app I need some ML to analyze the text the user enters into the app's search bar, so I can correctly look up the requested word. I mainly need part-of-speech (POS) tagging and lemmatization. Can you give me some example datasets?
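As a side note, Apple's NaturalLanguage framework already provides built-in POS tagging and lemmatization, so a custom model (and dataset) may not be needed for this part; a sketch:

```swift
import NaturalLanguage

// Tag each word of a search query with its lexical class and lemma.
func analyze(_ query: String) -> [(word: String, tag: String, lemma: String)] {
    let tagger = NLTagger(tagSchemes: [.lexicalClass, .lemma])
    tagger.string = query
    var results: [(String, String, String)] = []
    tagger.enumerateTags(in: query.startIndex..<query.endIndex,
                         unit: .word,
                         scheme: .lexicalClass,
                         options: [.omitWhitespace, .omitPunctuation]) { tag, range in
        let word = String(query[range])
        // Fall back to the surface form when no lemma is available.
        let lemma = tagger.tag(at: range.lowerBound, unit: .word, scheme: .lemma).0?.rawValue ?? word
        results.append((word, tag?.rawValue ?? "Unknown", lemma))
        return true
    }
    return results
}
```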
Hi,
I was watching
https://developer.apple.com/videos/play/wwdc2020/10148/
and I read the demo project code,
but I don't understand how Apple knows that writing a "W" in the drawing area corresponds to the "W" in the text field.
I understand that we separate every character from the drawing data, but how is it matched against the text field's text?
This part is confusing me.
Watching https://developer.apple.com/wwdc20/10644 and using Swift on AWS Lambda with Xcode.
When I call http://127.0.0.1:7000/invoke I get "Status: 405 Method Not Allowed" in the browser. My breakpoint is not hit, and there is no message in Xcode either. Did anyone get this to work?
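One thing worth checking: a browser address bar issues a GET request, while the local invoke endpoint of the Swift Lambda runtime expects a POST carrying the event payload, which would explain a 405. Something like the following, where the JSON body is a placeholder for whatever event your handler decodes:

```shell
# POST a sample event to the locally running Lambda (payload is a placeholder).
curl -v -X POST http://127.0.0.1:7000/invoke \
     -H "Content-Type: application/json" \
     -d '{"name": "world"}'
```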
https://developer.apple.com/contact/request/sirikit-media-intent-for-homepod
Xcode version: 12.4 (used Xcode 12.5 to create the encryption key, as 12.4 is bugged and will not let you).
I am bundling an ML model into my app and have encrypted it via the Xcode 12 encryption method shown here. The model loads and works fine without encryption. After adding the model encryption key and setting it as a compiler flag, the model encryption/decryption works fine in a demo app, but using the same load method in my main app fails with this error message:
Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.CoreML Code=0 "Unable to load model at /path_to_model/my_model.mlmodelc/ with error: failed to invoke mremap_encrypted with result = -1, error = 12"
I have tried all variations of the load method available in my Swift-generated model file; all produce the same result as above.