Connect with Apple engineers to discuss the latest Apple technologies announced at WWDC21.

Posts under WWDC21 tag

68 Posts
Post not yet marked as solved
2 Replies
1.3k Views
How can I create a List that starts at the bottom? All Lists start at the top, but in a chat, the desired content scroll direction is inverted. There are workarounds that rely on flipping the list with rotation effects, but these don't work well and make the logic hard to follow. Scrolling to the bottom after a short delay on list appear has side effects (it loads rows at the top). Ideally there would be a List { … }.scrollDirection(.inverted) API that automatically loads the list from its last item, not its first.
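Until such an API exists, one workaround that avoids the rotation trick is to jump to the newest row with ScrollViewReader. This is a sketch under assumptions (the Message model is hypothetical), not a true inverted list:

```swift
import SwiftUI

// Hypothetical chat model, for illustration only.
struct Message: Identifiable {
    let id = UUID()
    let text: String
}

struct ChatView: View {
    let messages: [Message]

    var body: some View {
        // ScrollViewReader lets us jump to the last message on appear.
        // New rows are still loaded top-down; this only moves the viewport.
        ScrollViewReader { proxy in
            List(messages) { message in
                Text(message.text).id(message.id)
            }
            .onAppear {
                if let last = messages.last {
                    proxy.scrollTo(last.id, anchor: .bottom)
                }
            }
        }
    }
}
```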
Posted. Last updated.
Post not yet marked as solved
0 Replies
411 Views
Yesterday (9/22) I upgraded to the Monterey beta. I was able to connect my iMac to my iPad Pro but was unable to engage Universal Control. Today there was an update to beta 12, and now I cannot connect the two devices. The iPad reports that pairing to the Mac was unsuccessful because "iMac is not supported".
Posted by mpullan. Last updated.
Post not yet marked as solved
1 Reply
416 Views
How do I turn on AssistiveTouch on Apple Watch? I saw it in the keynote as available in watchOS 8. I am using a Series 3 watch.
Posted by ethanfan. Last updated.
Post not yet marked as solved
2 Replies
545 Views
As per the documentation, at least one of the following conditions must be met in order to retrieve network info:
1. The app uses Core Location and has the user's authorization to use location information.
2. The app uses the NEHotspotConfiguration API to configure the current Wi-Fi network.
3. The app has an active VPN configuration installed.
4. The app has an active NEDNSSettingsManager configuration installed.
In our case, the application relies on point 3, i.e. an installed VPN profile. But I observed that CNCopyCurrentNetworkInfo always returns nil even though a VPN profile is configured. This worked fine on iOS 14.x. I also tried the fetchCurrentWithCompletionHandler API but ended up with the same result. Any help/lead would be highly appreciated. Thanks in advance.
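For reference, here is a minimal sketch of the call being discussed, using the CaptiveNetwork API. It returns nil unless one of the entitlement conditions above is actually satisfied at call time:

```swift
import SystemConfiguration.CaptiveNetwork

// Sketch: read the current Wi-Fi SSID. On iOS 15 this returns nil unless
// the app satisfies one of the documented conditions (location access,
// hotspot configuration, active VPN, or DNS settings configuration).
func currentSSID() -> String? {
    guard let interfaces = CNCopySupportedInterfaces() as? [String] else {
        return nil
    }
    for interface in interfaces {
        if let info = CNCopyCurrentNetworkInfo(interface as CFString) as? [String: AnyObject],
           let ssid = info[kCNNetworkInfoKeySSID as String] as? String {
            return ssid
        }
    }
    return nil
}
```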
Posted by aman.sa. Last updated.
Post not yet marked as solved
0 Replies
192 Views
How can we access our own health care data? Can we download the data through the Health app, through the studies in the Apple Research app, or in some other way?
Posted by ailalab. Last updated.
Post not yet marked as solved
0 Replies
347 Views
Hi, I had the problem of my navigationBar being transparent on iOS 15. After setting the scrollEdgeAppearance as suggested, the problem was resolved:

appearance.configureWithOpaqueBackground()
appearance.backgroundColor = <your tint color>
navigationBar.standardAppearance = appearance
navigationBar.scrollEdgeAppearance = navigationBar.standardAppearance

However, wherever I have a scrollView embedded in my ViewController, the view is moved up by around 50 px. Did anyone experience a similar issue?
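For anyone hitting the same transparency issue, the fix described above can be written as a self-contained sketch (the background color here is a placeholder, not the poster's actual tint):

```swift
import UIKit

// Apply an opaque bar appearance to both the standard and scroll-edge
// states, as described in the post. The color is a placeholder.
func configureNavigationBar(_ navigationBar: UINavigationBar) {
    let appearance = UINavigationBarAppearance()
    appearance.configureWithOpaqueBackground()
    appearance.backgroundColor = .systemBackground // placeholder tint
    navigationBar.standardAppearance = appearance
    navigationBar.scrollEdgeAppearance = appearance
}
```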
Posted by gg-fm. Last updated.
Post not yet marked as solved
0 Replies
211 Views
I am working with raw DNG files I captured with an iPhone 11 and an iPhone 12 (I used the Moment app to capture raw, not ProRAW). I want to check the linearity of raw RGB with respect to ISO. It is (almost) linear up to an ISO of around 200. Apparently there are a BaselineExposure tag and an ExposureBias (or exposure compensation) tag available in the DNG. I correct the linear values by 2^(baselineExposure + exposureCompensation), and it still does not fix my linearity issue. The best fit (very strange) I could get to my data was 2^baselineExposure * 2^(-10 * exposureCompensation), but it is still far from linear. Does anybody know how to interpret those tags? Thanks, Hamid
Posted by hmirzaei. Last updated.
Post not yet marked as solved
0 Replies
377 Views
Hello 👋. At WWDC21, Apple said that Spotlight now has rich results for content like movies and artists, and I'm trying to implement this feature. I already have an app that has some movie items indexed in Spotlight, but they show up the old way, not with the new rich results. Any idea if this feature is enabled in the latest beta? I've also watched the https://developer.apple.com/videos/play/wwdc2021/10098/ (Showcase app data in Spotlight) session, but from my understanding those are just improvements to NSCoreDataCoreSpotlightDelegate. Any help is appreciated. 🙏
Posted by VladS94. Last updated.
Post not yet marked as solved
11 Replies
3.9k Views
One of the primary issues with web development and dealing with Apple/Safari, compared to dealing with the Microsoft Edge and Google Chrome teams, is that it feels difficult for developers to have a voice in Safari development. Requests on bugs.webkit.org go unanswered, and critical features which are essential to many developers are never placed in the pipeline. From an outsider's perspective there seem to be several major issues:
1. The Safari team is underfunded and needs more engineers and a much higher budget.
2. There is the perception that a significant number of critical features do not get developed because Apple does not want competition with the App Store, and that by not developing those features it forces developers to use the App Store instead.
3. Developers are not given an opportunity to communicate with Apple.
Having worked with Safari since 2003, when it was one of the best, most feature-rich, advanced browsers, it's really disappointing for Safari to be the limiting factor in every single app we develop. Apple has the financing and the engineering talent to fix this. How can we open these lines of communication and make Safari one of the best browsers for developers once again?
Posted by Niskraw. Last updated.
Post marked as solved
1 Reply
1k Views
Hello everyone, This is a bit of a first for me: attaching the Safari debugger to a webview in our app crashes the app itself. This is Safari + macOS 12 + Xcode 13 + iOS 15 Simulator, so there are plenty of moving parts, but it is still a bit of a new one :). Raised the following bug report: FB9145558. This is the sanitised stack trace I can share:

Process: APPLICATION_NAME [17438]
Path: /Users/USER/Library/Developer/CoreSimulator/Devices/1F4F2CDE-64EB-44BA-8B40-1CC6197A3BC9/data/Containers/Bundle/Application/49053572-EE15-41B7-A8D0-94A7D92D76E5/APPLICATION_NAME.app/APPLICATION_NAME
Identifier: BUNDLE_PREFIX.APPLICATION_NAME
Version: 11.20.7 (2847)
Code Type: X86-64 (Native)
Role: Foreground
Parent Process: launchd_sim [17090]
Coalition: com.apple.CoreSimulator.SimDevice.1F4F2CDE-64EB-44BA-8B40-1CC6197A3BC9 [7161]
Responsible Process: SimulatorTrampoline [3280]
Date/Time: 2021-06-09 13:49:02.3675 +0100
Launch Time: 2021-06-09 13:48:44.2049 +0100
OS Version: macOS 12.0 (21A5248p)
Release Type: User
Report Version: 104
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000012
VM Region Info: 0x12 is not in any region. Bytes before following region: 4304560110
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
---> __TEXT 100926000-101ee2000 [ 21.7M] r-x/r-x SM=COW ...pp/APPLICATION_NAME
Exception Note: EXC_CORPSE_NOTIFY
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [17438]
Triggered by Thread: 16

Full sanitised stack trace attached. Full Crash Log. Kind Regards, Goffredo
Posted. Last updated.
Post not yet marked as solved
1 Reply
403 Views
I have a question about Apple's new API, Object Capture. When I use this API, the 3D model is generated as a USDZ file. Is it possible to generate 3D models as OBJ files, and if so, how?
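One avenue worth experimenting with is converting the generated USDZ with Model I/O, which can export OBJ. Whether MDLAsset can fully read a particular .usdz (geometry plus materials and textures) is not guaranteed, so treat this as an experiment rather than a supported conversion path:

```swift
import Foundation
import ModelIO

// Experimental sketch: try converting a PhotogrammetrySession .usdz to .obj
// via Model I/O. Capability checks guard against unsupported formats; how
// faithfully materials survive the round trip is not guaranteed.
func convertToOBJ(usdz: URL, obj: URL) throws {
    guard MDLAsset.canImportFileExtension(usdz.pathExtension),
          MDLAsset.canExportFileExtension("obj") else {
        throw NSError(domain: "ConvertToOBJ", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "Format not supported"])
    }
    let asset = MDLAsset(url: usdz)
    try asset.export(to: obj)
}
```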
Posted by itorin. Last updated.
Post not yet marked as solved
2 Replies
535 Views
I have an AVSampleBufferDisplayLayer playing a local video. When adding the AVSampleBufferDisplayLayer as a contentSource to AVPictureInPictureController, it fails with: The operation couldn't be completed. (PGPegasusErrorDomain error -1003.) I can't find any documented error with that domain and error code. Help?
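For context, here is a minimal sketch of the iOS 15 sample-buffer PiP setup. The playback delegate must report sensible time ranges and playback state; an incomplete delegate is one hypothesis (unconfirmed) for opaque Pegasus errors:

```swift
import AVKit

// Sketch: wire an AVSampleBufferDisplayLayer into Picture in Picture.
// The delegate answers below are deliberately minimal placeholders.
final class PiPController: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
    private var pipController: AVPictureInPictureController?

    func setUp(with layer: AVSampleBufferDisplayLayer) {
        let source = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: layer,
            playbackDelegate: self
        )
        pipController = AVPictureInPictureController(contentSource: source)
    }

    // MARK: - AVPictureInPictureSampleBufferPlaybackDelegate
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {}

    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        // Report a valid, non-empty range; a live/indefinite range shown here.
        CMTimeRange(start: .zero, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool { false }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```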
Posted. Last updated.
Post not yet marked as solved
1 Reply
593 Views
The operation couldn't be completed. (com.apple.ShazamKit error 300.) I get this error when I try to add multiple Shazam catalogs to a single SHCustomCatalog. The fact that the operation is called add(from:) and not 'load' suggests to me that it's meant to be used that way. Am I wrong? And yes, both work if I just add one at a time.

let catalog = SHCustomCatalog()
do {
    try catalog.add(from: url1)
    try catalog.add(from: url2)
    {...}
} catch {...}

Also, a fun fact: I was working on something like ShazamKit myself before dubdub, and it's a really fascinating topic. So awesome to see how well it works; great job! 👏
Posted. Last updated.
Post marked as solved
8 Replies
2.1k Views
Hi, I'm using the sample code to create a 3D object from photos using PhotogrammetrySession, but it returns this error: Error creating session: cantCreateSession("Native session create failed: CPGReturn(rawValue: -11)"). The sample code I used is this and this. Any idea? Thanks in advance!
Posted by Rubenfern. Last updated.
Post not yet marked as solved
10 Replies
2.2k Views
I'm seeing an error trying to test out async let. It seems like this should work, based on the async/await and structured concurrency session videos. Here's the code:

func doIt() async -> String {
    let t = TimeInterval.random(in: 0.25 ... 2.0)
    Thread.sleep(forTimeInterval: t)
    return String("\(Double.random(in: 0...1000))")
}

async {
    async let a = doIt()
    async let b = doIt()
    async let c = doIt()
    async let d = doIt()
    let results = await [a, b, c, d]
    for result in results {
        print("  \(result)")
    }
}

But I get this error for every async let statement:

error: AsyncLetSwift55WWDC21.playground:12:15: error: expression is 'async' but is not marked with 'await'
async let a = doIt()
              ^
              await

Am I missing something key, or is this a bug? I'm running this in the Xcode 13.0 beta, in a Playground. Thanks!!
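One hedged observation: later Xcode 13 betas replaced the unstructured async { } entry point with Task { }, and Task.sleep avoids blocking a thread the way Thread.sleep does. A sketch of the same shape under those assumptions:

```swift
import Foundation

// Same four-way fan-out, using Task { } (the replacement for the
// early-beta async { }) and a non-blocking sleep.
func doIt() async -> String {
    let nanos = UInt64(Double.random(in: 0.25 ... 2.0) * 1_000_000_000)
    try? await Task.sleep(nanoseconds: nanos)
    return String(Double.random(in: 0...1000))
}

Task {
    async let a = doIt()
    async let b = doIt()
    async let c = doIt()
    async let d = doIt()
    // Awaiting the array awaits all four child tasks.
    let results = await [a, b, c, d]
    for result in results {
        print("  \(result)")
    }
}
```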
Posted by drewster. Last updated.
Post not yet marked as solved
0 Replies
397 Views
I've recently updated my Mac mini 2018 to Big Sur. I use a wired Apple keyboard with it, but after updating to Big Sur, the 2 key and the at-sign key stopped working. I know my keyboard is fine because I tested it on my older Mac mini (late 2012), but it's not working in Big Sur. My keyboard input source is U.S. and the system language is also U.S., yet the issue persists.
Posted. Last updated.
Post not yet marked as solved
0 Replies
374 Views
Hello, Right when iPadOS 15 is released in the fall, I'll want to support the new large widget and a few of the other iPadOS 15-specific features. I'm working on them now in the Xcode 13 beta, but to give these features to users on launch day, can I update my app now with features built in the Xcode 13 beta? Or would I need to wait until Xcode 13 is public? Sorry if this question seems simple. This is my first time launching on a new update's release day, so I am unsure whether I can submit those new features now or have to wait until launch day. Please let me know! P.S. My app is multi-platform (iOS, iPadOS, and macOS). Would submitting early from an Xcode beta potentially mess up the other platforms?
Posted by helper132. Last updated.
Post not yet marked as solved
0 Replies
870 Views
We are going through the process of setting up Universal Control between an Intel Mac (A2141) and an A-chip iPad (A2228). This does not work as of now. We updated to the latest version last night on both the iPad and the MacBook. We can confirm that Wi-Fi is on, Bluetooth is on, and Handoff is on on both devices. Still nothing happens when we move the MacBook cursor to any edge (left, right, top). We tried looking online and found a website that claims that Universal Control will not be available for Intel Macs. The questions:
1. Will Universal Control be available for Intel MacBooks running Monterey?
2. Will it be available for A-chip iPad Pros running iOS 15?
3. Will it work between these two?
Thank you. The link to the website for reference: https://www.howtoisolve.com/universal-control-not-working-mac-and-ipad/
Posted. Last updated.
Post marked as solved
1 Reply
454 Views
I created a 3D model using Object Capture. https://developer.apple.com/videos/play/wwdc2021/10076/ I want to know where, in the object's coordinate space, each image used to create the model was taken. Can I get this information from the PhotogrammetrySession?
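If I recall the Object Capture API correctly, PhotogrammetrySession has a .poses request alongside .modelFile, and its result reports estimated camera poses for the input samples. Treat the exact request and result names here as assumptions to verify against the RealityKit documentation:

```swift
import RealityKit

// Sketch: ask the session for camera poses and inspect the result.
// Assumes Request.poses and Result.poses exist as named; verify before use.
func requestPoses(session: PhotogrammetrySession) async throws {
    try session.process(requests: [.poses])
    for try await output in session.outputs {
        if case let .requestComplete(_, .poses(poses)) = output {
            // The Poses value carries per-sample camera transforms.
            print(poses)
        }
    }
}
```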
Posted by sp9103. Last updated.
Post not yet marked as solved
0 Replies
433 Views
Hi everyone, I have a question about the App Store. We already have an AppExtension published on the store as version 1.0.0, and we are going to migrate it to a WebExtension (following this doc) and publish that as version 2.0.0. Could you help confirm the following cases?
1. If a user is on Safari 14 and never installed 1.0.0, they will install WebExtension 2.0.0.
2. If a user is on Safari 14 and installed 1.0.0, after installing 2.0.0 the AppExtension will update to the WebExtension automatically.
3. If a user is on Safari 13 and never installed 1.0.0, can they still install AppExtension 1.0.0?
4. If a user is on Safari 13 and installed 1.0.0, do they still get an upgrade notification for 2.0.0? And they can't upgrade successfully, right?
Thank you very much.
Posted by lu.xu. Last updated.