
Detect people, faces, and poses using Vision (WWDC21)

    Discover the latest updates to the Vision framework to help your apps detect people, faces, and poses. Meet the Person Segmentation API, which helps your app separate people in images from their surroundings, and explore the latest continuous metrics for tracking the pitch, yaw, and roll of the human head. And learn how these capabilities can be combined with other APIs like Core Image to deliver anything from simple virtual backgrounds to rich offline compositing in an image-editing app.

    To get the most out of this session, and to learn more about people analysis, we recommend first watching “Detect Body and Hand Pose with Vision” from WWDC20 and “Understanding Images in Vision Framework” from WWDC19.
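
    The continuous head-pose metrics mentioned above are reported per detected face by the face detection request. A minimal hedged sketch, not taken from the session itself: it assumes the iOS 15-era revision of VNDetectFaceRectanglesRequest (which adds pitch alongside the earlier roll and yaw), and `imageURL` is a placeholder for your input image.

    ```swift
    import Vision

    // Detect faces and read their head-pose angles (reported in radians).
    // Assumption: the request revision in use reports pitch (iOS 15+);
    // roll and yaw are also available on earlier revisions.
    let faceRequest = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([faceRequest])

    for face in faceRequest.results ?? [] {
        print("roll:", face.roll ?? 0,
              "yaw:", face.yaw ?? 0,
              "pitch:", face.pitch ?? 0)
    }
    ```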

    Resources

    • Applying Matte Effects to People in Images and Video
    • Vision
      • HD Video
      • SD Video

    Related Videos

    WWDC22

    • What's new in Vision

    WWDC21

    • Classify hand poses and actions with Create ML

    WWDC20

    • Detect Body and Hand Pose with Vision

    WWDC19

    • Understanding Images in Vision Framework
  • Code
    • 8:13 - Get segmentation mask from an image

      import Vision

      // Create the person segmentation request
      let request = VNGeneratePersonSegmentationRequest()
      
      // Create a request handler for the image
      let requestHandler = VNImageRequestHandler(url: imageURL, options: options)
      
      // Process the request
      try requestHandler.perform([request])
      
      // Review the results: the segmentation mask is a pixel buffer
      if let mask = request.results?.first {
          let maskBuffer = mask.pixelBuffer
      }
    • 8:33 - Configuring the segmentation request

      let request = VNGeneratePersonSegmentationRequest()
      
      request.revision = VNGeneratePersonSegmentationRequestRevision1
      request.qualityLevel = .accurate
      request.outputPixelFormat = kCVPixelFormatType_OneComponent8
    • 12:24 - Applying a segmentation mask

      import CoreImage
      import CoreImage.CIFilterBuiltins

      let input = CIImage(contentsOf: imageUrl)!
      let mask = CIImage(cvPixelBuffer: maskBuffer)
      let background = CIImage(contentsOf: backgroundImageUrl)!
      
      // Scale the mask and background to match the input image's extent
      let maskScaleX = input.extent.width / mask.extent.width
      let maskScaleY = input.extent.height / mask.extent.height
      let maskScaled = mask.transformed(by: CGAffineTransform(scaleX: maskScaleX,
                                                              y: maskScaleY))
      
      let backgroundScaleX = input.extent.width / background.extent.width
      let backgroundScaleY = input.extent.height / background.extent.height
      let backgroundScaled = background.transformed(by: CGAffineTransform(scaleX: backgroundScaleX,
                                                                          y: backgroundScaleY))
      
      // Blend: the mask's red channel selects the input over the background
      let blendFilter = CIFilter.blendWithRedMask()
      blendFilter.inputImage = input
      blendFilter.backgroundImage = backgroundScaled
      blendFilter.maskImage = maskScaled
      
      let blendedImage = blendFilter.outputImage
    • 14:37 - Segmentation from AVCapture

      private let photoOutput = AVCapturePhotoOutput()
      …
      if self.photoOutput.isPortraitEffectsMatteDeliverySupported {
         self.photoOutput.isPortraitEffectsMatteDeliveryEnabled = true
      }
      
      open class AVCapturePhoto {
      …
      var portraitEffectsMatte: AVPortraitEffectsMatte? { get } // nil if no people in the scene
      …
      }
    • 14:58 - Segmentation in ARKit

      if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
          // Proceed with getting the person segmentation mask
          …
      }
      
      open class ARFrame {
      …
      var segmentationBuffer: CVPixelBuffer? { get }
      …
      }
    • 15:31 - Segmentation in CoreImage

      import CoreImage.CIFilterBuiltins

      let input = CIImage(contentsOf: imageUrl)!
      
      let segmentationFilter = CIFilter.personSegmentation()
      segmentationFilter.inputImage = input
      
      let mask = segmentationFilter.outputImage
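
    The Core Image snippets above stop at an optional CIImage; rendering that output for display is not shown in the session. A short hedged sketch, assuming the `blendedImage` produced by the blend snippet earlier, using CIContext to produce a bitmap:

    ```swift
    import CoreImage

    // Render the filter output into a CGImage for display or export.
    // `blendedImage` is the optional CIImage from the blend snippet above.
    let context = CIContext()
    if let output = blendedImage,
       let cgImage = context.createCGImage(output, from: output.extent) {
        // Wrap cgImage in a UIImage/NSImage, or write it out with CGImageDestination.
    }
    ```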
