I'm making an app:

- It displays the image from the rear camera in real time, in an ARView. (✅)
- When it detects a QR code, it reads the text content of the QR code. (I don't know how to do this in real time, i.e. how to get frames out of the AR session.) My decoding helper so far:
```swift
import CoreVideo
import Vision

// Decodes the first QR code found in a pixel buffer; nil if none is found.
func getQRCodeContent(_ pixel: CVPixelBuffer) -> String? {
    let requestHandler = VNImageRequestHandler(cvPixelBuffer: pixel, options: [:])
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr]
    try? requestHandler.perform([request])   // a failed request simply yields no results
    return request.results?.first?.payloadStringValue
}
```
- And then do some logic with the content and display the corresponding AR model in the ARView (rough sketch below).
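For that last step, this is roughly what I have in mind. A minimal sketch, assuming a model bundled with the app; the "robot" payload and the robot.usdz asset name are placeholders I made up:

```swift
import RealityKit

// Sketch: map a decoded QR payload to a bundled model and anchor it on a plane.
func placeModel(for payload: String, in arView: ARView) {
    // "robot" is a hypothetical payload; loads robot.usdz from the main bundle.
    guard payload == "robot",
          let model = try? ModelEntity.loadModel(named: "robot") else { return }
    let anchor = AnchorEntity(plane: .horizontal)   // first detected horizontal plane
    anchor.addChild(model)
    arView.scene.addAnchor(anchor)
}
```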
I know I have to feed images into the Vision framework. I started with AVFoundation, but found that as soon as the ARView loads, the AVCaptureSession is interrupted, because ARKit takes over the camera. So I want to feed the ARSession's own frames into Vision instead. However, all the tutorials I can find are based on Storyboards and UIKit, and I don't know how to do this in SwiftUI at all.
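From the ARKit documentation, every ARFrame carries the camera image as a CVPixelBuffer in its capturedImage property, which is exactly what VNImageRequestHandler accepts, so a one-off call fits together like this (a sketch, assuming an already-running session and ignoring orientation):

```swift
// One-shot check of the most recent frame (not yet real-time):
if let buffer = arView.session.currentFrame?.capturedImage {
    print(getQRCodeContent(buffer) ?? "no QR code in this frame")
}
```

One detail I'm unsure about: capturedImage comes in the sensor's landscape orientation, so I may need VNImageRequestHandler(cvPixelBuffer:orientation:options:) with the right CGImagePropertyOrientation for detection to be reliable.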
I tried to extend ARView so it could act as its own session delegate:

```swift
extension ARView: ARSessionDelegate {
    func renderer(_ renderer: SKRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
        let capturedImage = session.currentFrame?.capturedImage
        print(capturedImage)
    }
}
```
```swift
struct ARViewCustom: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.delegate = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```
It compiles without errors, but nothing is ever printed.
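My current guess is that `renderer(_:willRenderScene:atTime:)` is a SceneKit delegate method (from SCNSceneRendererDelegate) that RealityKit's ARView never calls, and that the ARSessionDelegate callback I actually need is `session(_:didUpdate:)`. Below is a minimal sketch of the wiring I think SwiftUI wants, with a Coordinator as the session delegate and the getQRCodeContent(_:) helper from above; the frame-skipping flag and queue choice are my own assumptions:

```swift
import SwiftUI
import RealityKit
import ARKit

struct ARViewCustom: UIViewRepresentable {
    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // The coordinator, not the view itself, listens to the session.
        arView.session.delegate = context.coordinator
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    final class Coordinator: NSObject, ARSessionDelegate {
        private var isScanning = false   // skip frames while one is being decoded

        // Called by ARKit for every new frame (on the main queue by default).
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard !isScanning else { return }
            isScanning = true
            let buffer = frame.capturedImage
            DispatchQueue.global(qos: .userInitiated).async { [weak self] in
                if let payload = getQRCodeContent(buffer) {
                    print("QR payload:", payload)
                    // TODO: hand the payload back to SwiftUI / place the model.
                }
                self?.isScanning = false
            }
        }
    }
}
```

Is this the right direction, or is there a more idiomatic way to get per-frame Vision processing in SwiftUI?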