I am writing to follow up on my WWDC24 lab. I had a 1:1 lab with Mr. Kavin; we had a good 30-minute session, and for follow-up questions Kavin asked me to post them here. Here is my question: we have screen sharing in our application and are trying to use CFMessagePort to pass a CVPixelBufferRef from the broadcast extension to the application. Questions:
1. How can I copy the planes of an IOSurface-backed CVPixelBufferRef onto another one without using memcpy? Is there a zero-copy method?
2. How can I get notified when the data of an IOSurface-backed CVPixelBufferRef is changed by another process?
3. How can I send an IOSurface-backed CVPixelBufferRef from the broadcast extension to the application?
4. How can I pass an unowned IOSurfaceRef from the broadcast extension to the application?
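For question 1, a zero-copy approach is to wrap the IOSurface that already backs the source buffer in a new CVPixelBufferRef, rather than copying plane bytes. A minimal Swift sketch (error handling elided; assumes the source buffer was created with IOSurface backing):

```swift
import CoreVideo
import IOSurface

/// Wraps the IOSurface backing `source` in a new CVPixelBuffer.
/// Both buffers then share the same pixel storage: no memcpy.
func wrapWithoutCopy(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    // Returns nil if the buffer was not created with IOSurface backing.
    guard let surface = CVPixelBufferGetIOSurface(source)?.takeUnretainedValue() else {
        return nil
    }
    var wrapped: Unmanaged<CVPixelBuffer>?
    let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault,
                                                  surface,
                                                  nil,   // pixel buffer attributes
                                                  &wrapped)
    guard status == kCVReturnSuccess else { return nil }
    return wrapped?.takeRetainedValue()
}
```

For questions 3 and 4, IOSurfaceCreateMachPort / IOSurfaceLookupFromMachPort can move a surface between processes, but note that a mach port cannot travel inside a CFMessagePort payload (CFMessagePort only carries CFData); an XPC connection with IOSurfaceCreateXPCObject is the usual route. For question 2, I am not aware of a built-in change notification; polling IOSurfaceGetSeed for a changed seed value is a common workaround.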
Search results for "A Summary of the WWDC25 Group Lab" (10,084 results found)
I cannot seem to change my address and phone in the Developer Account Summary section. Member Center > Your Account > Developer Account Summary seems to have different information than Member Center > Your Account > Apple ID Summary, so changing my Apple ID info does nothing to the Developer Account Summary.
In the Developer Account Summary, the name is different from my name; my surname isn't shown in this section. Is that a problem? How can I change it? Thank you.
Hi all. I am working with the WeatherKit REST API to integrate it into my Android app. As a Dark Sky API user I could get summary text, which was really helpful, especially for users interested in the next-hour rain summary. But WeatherKit doesn't give any human-readable text summaries. I'm asking the WeatherKit developers to add this as soon as possible. Until then, is there a recommended simple way to create human-readable text summaries myself?
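Until WeatherKit provides summary text, one option is to derive it from the minute-by-minute precipitation data in the next-hour forecast. A minimal Swift sketch; the `MinuteForecast` shape and the 0.5 threshold are assumptions for illustration, not the actual REST schema:

```swift
// Hypothetical minute entry; the field name is an assumption,
// not the actual WeatherKit REST schema.
struct MinuteForecast {
    let precipitationChance: Double   // 0.0 ... 1.0
}

/// Builds a Dark Sky-style next-hour summary from up to 60 minute entries.
func nextHourSummary(minutes: [MinuteForecast], threshold: Double = 0.5) -> String {
    let rainyOffsets = minutes.indices.filter { minutes[$0].precipitationChance >= threshold }
    guard let firstRain = rainyOffsets.first else {
        return "No precipitation expected in the next hour."
    }
    if firstRain > 0 {
        // Dry now; report when rain is likely to begin.
        return "Rain starting in \(firstRain) min."
    }
    // Raining now; find the first dry minute, if any.
    if let firstDry = minutes.indices.first(where: { minutes[$0].precipitationChance < threshold }) {
        return "Rain stopping in \(firstDry) min."
    }
    return "Rain for the next hour."
}
```

The same bucketing idea extends to intensity (drizzle vs. heavy rain) if the response includes a per-minute intensity field.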
You guys literally need to update this video introducing the SwiftUI WebView; these APIs don't even exist... https://developer.apple.com/videos/play/wwdc2025/231/
I watched this year's WWDC25 session "Read documents using the Vision framework." At the end of the video there is a mention of a new DetectHandPoseRequest model for hand pose detection in the Vision API. I looked at the Apple documentation and I don't see a new revision. Moreover, there is probably a typo in the video, because there are only DetectHumanHandPoseRequest (Swift-based) and VNDetectHumanHandPoseRequest (Obj-C-based); notice the missing "Human" in the WWDC video. The first one has only the revision added in iOS 18+: https://developer.apple.com/documentation/vision/detecthumanhandposerequest/revision-swift.enum/revision1 The second one has only the revision added in iOS 14+: https://developer.apple.com/documentation/vision/vndetecthumanhandposerequestrevision1 I don't see any new revision targeting iOS 26+.
Hi. I’m planning on requesting a developer lab next week. Specifically, I plan on asking about the use of compiler technology in iOS apps. As far as I’m aware, the dev labs are centered around specific APIs like CoreML, CreateML, etc., and I’m not sure what category compilers fall into. Should I just sign up for a general Swift lab? Does one even exist? This is my first WWDC, so any advice would be lovely. Thanks in advance! :)
How can I achieve the button glass effect shown in the sample videos at WWDC25? I have tried a lot of approaches and am still far away from what the video shows. I would like something like the attached pictures. Could you send sample code that gets the same result? Thanks
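A starting point is the Liquid Glass modifiers introduced for SwiftUI at WWDC25. A minimal sketch, assuming the iOS 26 SDK; exact parameters and availability may differ across betas:

```swift
import SwiftUI

// Sketch of the WWDC25 Liquid Glass button treatments.
struct GlassButtons: View {
    var body: some View {
        HStack(spacing: 16) {
            // The built-in glass button style.
            Button("Plain glass") { }
                .buttonStyle(.glass)

            // A tinted, interactive glass capsule on an arbitrary view.
            Label("Tinted", systemImage: "sparkles")
                .padding()
                .glassEffect(.regular.tint(.blue).interactive(), in: .capsule)
        }
        .padding()
    }
}
```

The effect reads best over colorful content; grouping several glass views in a `GlassEffectContainer` is what the WWDC sessions recommend when shapes should blend as they move.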
The new summary notifications are great, but are we supposed to receive an alert when it’s time to review the summary? For example, my afternoon summary is ready and it shows when I turn on the screen, but I wasn’t alerted.
I'm currently using an iPhone 5s and a 7 Plus (both with the latest firmware) and the problem occurs on both devices: the call summary gets my devices stuck (sometimes it works fine), not letting me accept or dismiss the call summary, and I end up having to restart my phone several times.
Since December 13th, my Summary Sales Reports have been empty; the files contain only the column header. Is anyone else experiencing this?
My WWDC24 Developer Lab confirmation email said to come here if there are questions or issues. Well I was on the Webex at the appointed time and no one showed up. What is my next course of action?
Topic: Community
SubTopic: Apple Developers
Hi, I was going to apply to the developer labs for Vision Pro, but I've never gone through this process in the Apple ecosystem. Are the developer labs still open? On the website there is a button to apply, but it takes me to a page with no upcoming events. I want to make sure I'm not missing something, or whether I need to upgrade my account in order to apply to the developer labs: https://developer.apple.com/visionos/labs/
I cannot get AU Lab to see the FilterDemo v3 AU created by the Apple sample code, though I can see it in GarageBand. I found some references saying AU Lab doesn't support v3 AUs, though it is almost shocking that an Apple test tool doesn't support AU v3. I looked for a later version of AU Lab, but that is also frustrating (my version is 2.2.2). So the question is simple: is there a version of AU Lab that does support v3 AUs, and if so, where can I find it? If not, can someone recommend a good alternative? Using GarageBand or Logic Pro is really not what I want to do when I'm testing AudioUnits, especially when I want more free-form routing.
As part of the 6/12/25 SwiftUI group lab, Curt and Taylor spoke to this specifically regarding the use of color in Liquid Glass SwiftUI views, saying that use of color should be limited. However, they didn’t speak to how to put color back into List view outline disclosure carets, which are no longer colored in the new iOS/iPadOS 26. I would still like to know whether I can add color back to the carets. Here was my group lab question that was discussed: “I’ve noticed that with the new design language, SwiftUI views appear not to use color as much. For example, color modifiers for List view items like carets. Is this intended, and can developers introduce color back into SwiftUI view elements, if desired, like in iOS/iPadOS 18?”
Topic: UI Frameworks
SubTopic: SwiftUI