Post not yet marked as solved
The existing YOLOv3 Core ML model has only 80 class labels (the COCO categories). I want to add some more categories to be used in my app — can I do that? If it is possible, how can I do it?
I achieved streaming a depth map using ARKit by referring to this GitHub sample:
https://github.com/TokyoYoshida/ExampleOfiOSLiDAR/blob/main/ExampleOfiOSLiDAR/Samples/Depth/DepthMapViewController.swift
Now I would like to find the point closest to the rear camera and mark it on the depth map, so that the app automatically detects and labels the closest point while it is running. How can I get the closest pixel with ARKit and CVPixelBuffer?
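The "closest point" is just the pixel with the smallest depth value, so a plain scan over the depth buffer is enough. Below is a minimal sketch of that scan in pure Swift; the function name `closestPoint` and the `[Float]` input are illustrative stand-ins for the Float32 rows you would copy out of the CVPixelBuffer after locking it with `CVPixelBufferLockBaseAddress`.

```swift
// Sketch: find the pixel with the smallest (closest) depth value.
// `depth` stands in for one row-major Float32 depth map
// (width * height values) read out of the CVPixelBuffer.
func closestPoint(in depth: [Float], width: Int, height: Int)
        -> (x: Int, y: Int, depth: Float)? {
    var best: (x: Int, y: Int, depth: Float)? = nil
    for y in 0..<height {
        for x in 0..<width {
            let d = depth[y * width + x]
            // Skip invalid readings (zeros or NaNs) that LiDAR maps can contain.
            guard d > 0, d.isFinite else { continue }
            if best == nil || d < best!.depth {
                best = (x, y, d)
            }
        }
    }
    return best
}
```

On device you would lock the buffer with `CVPixelBufferLockBaseAddress`, read `CVPixelBufferGetBaseAddress` as `Float32`, and step rows by `CVPixelBufferGetBytesPerRow / MemoryLayout<Float32>.stride` rather than by `width`, since rows can be padded. The returned `(x, y)` can then be converted to view coordinates to draw the label.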
I'm new to Swift, but I have to make an app that detects obstacles in real time and measures the distance from the iPhone to each obstacle. Is it feasible to use ARKit to capture sufficiently accurate depth data for this?
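On LiDAR-equipped iPhones this is feasible: ARKit's `sceneDepth` values are already metres along the camera's Z axis, so measuring the distance to an obstacle amounts to reading the depth at a pixel and unprojecting it with the camera intrinsics. A minimal sketch of that unprojection, assuming `fx`, `fy`, `cx`, `cy` are the focal lengths and principal point you would read from `frame.camera.intrinsics` (the function names here are illustrative):

```swift
// Sketch: unproject a depth-map sample (pixel u, v with depth in metres)
// into a camera-space 3D point using pinhole-camera intrinsics.
func cameraSpacePoint(u: Float, v: Float, depth: Float,
                      fx: Float, fy: Float,
                      cx: Float, cy: Float) -> SIMD3<Float> {
    let x = (u - cx) * depth / fx
    let y = (v - cy) * depth / fy
    return SIMD3<Float>(x, y, depth)
}

// Straight-line distance from the camera to that point.
func distance(to p: SIMD3<Float>) -> Float {
    (p.x * p.x + p.y * p.y + p.z * p.z).squareRoot()
}
```

For a pixel at the principal point the distance equals the raw depth value; off-centre pixels come out slightly farther, which matters if the obstacle is near the edge of the frame. Running this per frame against the `sceneDepth` buffer gives a real-time distance readout.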