How can I remove extra noise and detect the bounce point in a 2D image frame using VNDetectTrajectoriesRequest in iOS?

The 2D image frame is extracted from a live or pre-recorded video where the camera is placed behind one player so that the complete tennis court is visible in the frame. Court detection and ball detection are already done using CoreML and the Vision APIs. The next step is to detect the trajectory and the bounce point of the ball to decide whether the ball landed in or out of the court for scoring and analysis. I've used VNDetectTrajectoriesRequest to draw the ball's trajectory, with the detected court's boundingBox as the ROI for trajectory detection. The problem is that I am not able to remove the extra noise (coming from player movement in each frame) from the detection, because the player is also inside the ROI. Also, how should I proceed with ball bounce detection?
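One idea I'm considering for the player noise is to discard any trajectory whose points mostly fall inside the player's bounding box, which I already get from the person detector. A sketch of that filter (the function name, `playerBox`, and `rejectionThreshold` are my own; everything works in Vision's normalized coordinate space):

```swift
import Foundation

/// Returns true when a trajectory is likely player noise, i.e. when more than
/// `rejectionThreshold` of its normalized points lie inside the player's
/// detected bounding box. `rejectionThreshold` is an assumed tuning parameter.
func isPlayerNoise(points: [CGPoint],
                   playerBox: CGRect,
                   rejectionThreshold: Double = 0.5) -> Bool {
    guard !points.isEmpty else { return false }
    let insideCount = points.filter { playerBox.contains($0) }.count
    return Double(insideCount) / Double(points.count) > rejectionThreshold
}
```

I would then drop the observation in the completion handler whenever `isPlayerNoise` returns true for its `detectedPoints`, but I'm not sure this is robust when the ball passes in front of the player.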

private func detectTrajectories(_ controller: CameraViewController, _ buffer : CMSampleBuffer, _ orientation : CGImagePropertyOrientation) throws {
    
    let visionHandler = VNImageRequestHandler(cmSampleBuffer: buffer,
                                              orientation: orientation,
                                              options: [:])
    
    let normalizedFrame = CGRect(x: 0, y: 0, width: 1, height: 1)
    DispatchQueue.main.async {
        // Get the frame of the rendered view.
        self.trajectoryView.frame = controller.viewRectForVisionRect(normalizedFrame)
        self.trajectoryView.roi = controller.viewRectForVisionRect(normalizedFrame)
        
    }
    
    // Set up the trajectory request only once: VNDetectTrajectoriesRequest is
    // stateful and must be reused across frames to accumulate a trajectory.
    if detectTrajectoryRequest == nil {
        setUpDetectTrajectoriesRequestWithMaxDimension()
    }
    
    do {
        // Help manage the real-time use case to improve the precision versus delay tradeoff.
        detectTrajectoryRequest.targetFrameTime = .zero
        
        // The region of interest where the object is moving in the normalized image space.
        detectTrajectoryRequest.regionOfInterest = normalizedFrame
        
        try visionHandler.perform([detectTrajectoryRequest])
    } catch {
        print("Failed to perform the trajectory request: \(error.localizedDescription)")
        return
    }
}

func setUpDetectTrajectoriesRequestWithMaxDimension() {
    detectTrajectoryRequest = VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero,
                                                          trajectoryLength: trajectoryLength,
                                                          completionHandler: completionHandler)
    // Constrain detections to ball-sized objects to reject larger moving regions.
    detectTrajectoryRequest.objectMinimumNormalizedRadius = 0.003
    detectTrajectoryRequest.objectMaximumNormalizedRadius = 0.005
}
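The radius bounds above are fractions of the normalized image, so I derived them from the expected on-screen ball size. A small helper showing that conversion (the helper and its parameter names are my own; the frame width comes from the capture format):

```swift
/// Converts an expected ball radius in pixels to the normalized radius that
/// VNDetectTrajectoriesRequest expects, relative to the frame width.
/// Hypothetical helper for picking objectMinimum/MaximumNormalizedRadius.
func normalizedBallRadius(ballRadiusInPixels: Double, frameWidth: Double) -> Double {
    return ballRadiusInPixels / frameWidth
}
```

For a 1920-pixel-wide frame, a ball radius of roughly 6–10 pixels maps to about 0.003–0.005, which is where the values above come from.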

private func completionHandler(request: VNRequest, error: Error?) {
    if let e = error {
        print(e)
        return
    }
    
    guard let observations = request.results as? [VNTrajectoryObservation] else { return }
    let relevantTrajectory = observations.filter { $0.confidence > trajectoryDetectionConfidence}
    
    if let trajectory = relevantTrajectory.first {
        
        DispatchQueue.main.async {
            print(trajectory.projectedPoints.count)
            self.trajectoryView.duration = trajectory.timeRange.duration.seconds
            self.trajectoryView.points = trajectory.detectedPoints
            self.trajectoryView.performTransition(.fadeIn, duration: 0.05)
            
            if !self.trajectoryView.fullTrajectory.isEmpty {
                self.trajectoryView.roi = CGRect(x: 0, y: 0, width: 1, height: 1)
            }
        }

        DispatchQueue.main.asyncAfter(deadline: .now() + 1.5, execute: {
            self.trajectoryView.resetPath()
        })
    }
}

In the completion handler, I discard every VNTrajectoryObservation whose confidence is below 0.9 (trajectoryDetectionConfidence). The remaining trajectory is then drawn on the frame by trajectoryView.
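For the bounce itself, my current idea is to look for the point where the ball's vertical motion reverses within the accumulated trajectory points. In Vision's normalized space the y axis points up, so a bounce should appear as a local minimum in y. A sketch of that check (the function is my own and assumes the points are ordered in time):

```swift
import Foundation

/// Returns the index of a candidate bounce point: the first local minimum in
/// y, i.e. where the ball stops descending and starts rising again.
/// Assumes time-ordered points in Vision's normalized space (y increases up).
func bounceIndex(points: [CGPoint]) -> Int? {
    guard points.count >= 3 else { return nil }
    for i in 1..<(points.count - 1) {
        let wasDescending = points[i].y < points[i - 1].y
        let nowAscending = points[i + 1].y > points[i].y
        if wasDescending && nowAscending { return i }
    }
    return nil
}
```

The returned normalized point could then be tested against the detected court region to call the ball in or out, but I don't know whether this is reliable given Vision's sampling rate, or whether comparing consecutive trajectories' equationCoefficients would work better.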