API to enable control of motorized iPhone stands from within any camera app.

Posts under DockKit tag

8 Posts
Post · Replies · Boosts · Views · Activity

DockKit anomalies -- anyone else seen these?
I have two DockKit anomalies to report. Hoping a DTS person has seen these and/or can comment.

First, my setup: I control the accessory by making repeated calls to set the angular velocity, and the first thing I do is call dockManager.setSystemTrackingEnabled(false), because I'm doing my own tracking. I'll note that I tried calling track() myself, with a bunch of observation rectangles (or even just one), but it didn't work well even though I was calling at the correct rate. Instead, I measure the angular deviation between where the camera points and where I want it pointed, and set the angular velocity proportional to the error.

First issue: in normal operation, the green tracking light on the hardware (the Instaflow Pro 360 motorized dock) is on. Squeezing the trigger toggles the green light on/off; only when the light is on will the dock accept my calls to set the angular velocity. Fine. But sometimes squeezing the trigger won't reactivate the green light. In this case, the ONLY thing that works is switching to the Instaflow Pro 360 app and activating the camera: the green light immediately turns on, and I'm good (and can return to my own app with the green light still on). So what hidden API call does Instaflow have that I don't, that makes this happen? Sure, it's their own app, but I imagine they don't have access to calls I don't, so how does their app get the green light back on? It doesn't always happen. Would love to know how to snap out of this state.

Second issue: while I usually use rectangles from the vision system to guide the camera position, sometimes I let the user control the angular "yaw" velocity (rotation around the vertical axis) directly, by issuing commands over the network. Occasionally, when the user sets a non-zero velocity and then sets a zero velocity a short time later, the camera doesn't immediately respond and stop. (It's not a network issue; I can verify the API sends a call to set the angular velocity to zero, and the camera keeps rotating for a good fraction of a second.) Most times the camera stops immediately, but sometimes it doesn't. Oddly, I never see this on the "pitch up/down" axis, just yaw. Anybody else seen this? I feel like it wasn't a problem until I got to iOS 18, but I won't swear to it.

Any advice/assistance/discussion greatly appreciated.
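For context, here is a minimal sketch of the control scheme described above (proportional velocity steering with system tracking disabled). It is not the poster's code; the gain, the clamp, and the assumption that the y component of the velocity vector is the yaw axis are all mine:

import DockKit
import Spatial

// One-time setup before streaming velocities: take over from system tracking.
func takeOverTracking() async throws {
    try await DockAccessoryManager.shared.setSystemTrackingEnabled(false)
}

// Called at the control rate (e.g. 30 Hz). yawError is the signed angular
// deviation, in radians, between where the camera points and where it should.
func steer(_ accessory: DockAccessory, yawError: Double) async throws {
    let gain = 1.5                                    // assumed tuning value
    let speed = max(-2.0, min(2.0, gain * yawError))  // clamp to ±2 rad/s (assumed)
    // Assumption: the y component is rotation about the vertical (yaw) axis;
    // verify the axis convention against your accessory.
    try await accessory.setAngularVelocity(Vector3D(x: 0, y: speed, z: 0))
}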
1 · 0 · 177 · Nov ’24
DockKit tracking becomes erratic with increased zoom factor in iOS app
I'm developing an iOS app using DockKit to control a motorized stand. I've noticed that as the zoom factor of the AVCaptureDevice increases, the stand's movement becomes increasingly erratic up and down, almost like a pendulum motion. I'm not sure why this is happening or how to fix it. Here's a simplified version of my tracking logic:

func trackObject(_ boundingBox: CGRect, _ dockAccessory: DockAccessory) async throws {
    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device) else {
        fatalError("Camera not available")
    }

    let currentZoomFactor = device.videoZoomFactor
    let dimensions = device.activeFormat.formatDescription.dimensions
    let referenceDimensions = CGSize(width: CGFloat(dimensions.width),
                                     height: CGFloat(dimensions.height))

    let intrinsics = calculateIntrinsics(for: device, currentZoom: Double(currentZoomFactor))

    let deviceOrientation = UIDevice.current.orientation
    let cameraOrientation: DockAccessory.CameraOrientation = {
        switch deviceOrientation {
        case .landscapeLeft: return .landscapeLeft
        case .landscapeRight: return .landscapeRight
        case .portrait: return .portrait
        case .portraitUpsideDown: return .portraitUpsideDown
        default: return .unknown
        }
    }()

    let cameraInfo = DockAccessory.CameraInformation(
        captureDevice: input.device.deviceType,
        cameraPosition: input.device.position,
        orientation: cameraOrientation,
        cameraIntrinsics: useIntrinsics ? intrinsics : nil, // useIntrinsics is a flag defined elsewhere
        referenceDimensions: referenceDimensions
    )

    let observation = DockAccessory.Observation(
        identifier: 0,
        type: .object,
        rect: boundingBox
    )
    let observations = [observation]

    try await dockAccessory.track(observations, cameraInformation: cameraInfo)
}

func calculateIntrinsics(for device: AVCaptureDevice, currentZoom: Double) -> matrix_float3x3 {
    let dimensions = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
    let width = Float(dimensions.width)
    let height = Float(dimensions.height)

    let diagonalPixels = sqrt(width * width + height * height)
    let estimatedFocalLength = diagonalPixels * 0.8

    let fx = Float(estimatedFocalLength) * Float(currentZoom)
    let fy = fx
    let cx = width / 2.0
    let cy = height / 2.0

    return matrix_float3x3(
        SIMD3<Float>(fx, 0, cx),
        SIMD3<Float>(0, fy, cy),
        SIMD3<Float>(0, 0, 1)
    )
}

I'm calling this function regularly (10-30 times per second) with updated bounding box information, and the erratic movement seems to worsen as the zoom factor increases.

Questions:

1. Why might increasing the zoom factor cause this erratic movement?
2. I'm currently calculating camera intrinsics based on the current zoom factor. Is this approach correct, or should I be doing something differently?
3. Are there any other factors I should consider when using DockKit with a variable zoom?
4. Could the frequency of calls to trackObject (10-30 times per second) be contributing to the erratic movement? If so, what would be an optimal frequency?

Any insights or suggestions would be greatly appreciated. Thanks!
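One speculative angle on question 2 (my note, not from the thread): the diagonal-times-0.8 focal-length guess can sit far from the true focal length, and the error is multiplied by the zoom factor, so angular corrections overshoot more as zoom rises. A hedged alternative is to derive fx from the format's reported field of view. This sketch assumes videoFieldOfView describes the horizontal FOV at 1x zoom and that digital zoom scales focal length linearly:

import AVFoundation
import simd

// Sketch: pinhole intrinsics from the active format's reported field of view
// rather than a guessed focal length. Square pixels assumed (fy == fx).
func intrinsicsFromFieldOfView(for device: AVCaptureDevice, zoom: Float) -> matrix_float3x3 {
    let dims = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
    let width = Float(dims.width)
    let height = Float(dims.height)

    let hfov = device.activeFormat.videoFieldOfView * Float.pi / 180
    // Pinhole model: width = 2 * fx * tan(hfov / 2)  =>  fx = width / (2 * tan(hfov / 2))
    let fx = (width / (2 * tanf(hfov / 2))) * zoom

    // Row-major intrinsics: fx 0 cx / 0 fy cy / 0 0 1.
    return matrix_float3x3(rows: [
        SIMD3<Float>(fx, 0, width / 2),
        SIMD3<Float>(0, fx, height / 2),
        SIMD3<Float>(0, 0, 1)
    ])
}

Steadier intrinsics (or smoothing the bounding boxes before calling track) might reduce the pendulum-style overshoot, but that is a guess to be tested, not a confirmed fix.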
8 · 0 · 386 · Nov ’24
DockKit in custom app not tracking anymore after updating to iOS 18
Hello, I'm using DockKit within my SwiftUI application with GetStream. Before updating to iOS 18 yesterday, the custom tracking using DockKit worked like a charm, but after updating it stopped working unexpectedly. What's more curious: the official GetStream video calls application still works on iOS 18, just not my own app. I can confirm that my iPhone is still paired, I receive logs about the current docking state, and everything seems fine. Any suggestions on what I'm missing here?
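One thing worth double-checking (my speculation, not a confirmed fix from this thread): DockKit ignores custom track() calls while system tracking is enabled, and system tracking is the default, so an OS update is a good moment to make sure the app still disables it explicitly after pairing. A minimal sketch:

import DockKit

// Disable system tracking explicitly once the accessory is docked, rather
// than relying on state carried over from before the update.
func configureForCustomTracking() async {
    do {
        try await DockAccessoryManager.shared.setSystemTrackingEnabled(false)
    } catch {
        print("Failed to disable system tracking: \(error)")
    }
}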
0 · 0 · 354 · Sep ’24
About the DockKit API in iOS 18
We are experimenting with the DockKit API in iOS 18. However, we are unable to retrieve the speakingConfidence, lookingAtCameraConfidence, and saliencyRank for the person being tracked. We are able to get the rect and identifier. Has anyone been able to retrieve speakingConfidence, lookingAtCameraConfidence, and saliencyRank?
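For what it's worth, a minimal sketch of where those values should surface, based on my reading of the iOS 18 additions. The trackingStates sequence, the trackedSubjects array, and the .person case are names from memory and should be verified against the current documentation:

import DockKit

// Sketch: observe per-person tracking metadata from an accessory. Note the
// confidence values may legitimately be nil while the accessory has too
// little data (e.g. no speech observed yet for speakingConfidence).
func observeTrackingMetadata(for accessory: DockAccessory) async {
    for await state in accessory.trackingStates {
        for subject in state.trackedSubjects {
            if case .person(let person) = subject {
                print(person.identifier,
                      person.rect,
                      person.saliencyRank as Any,
                      person.speakingConfidence as Any,
                      person.lookingAtCameraConfidence as Any)
            }
        }
    }
}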
7 · 0 · 642 · Aug ’24
Programming with DockKit
Any source code samples for how to program DockKit? I have read https://developer.apple.com/documentation/DockKit and would like to see it used in an app. For instance, how do I set up the accessory-state notifications in a SwiftUI-based app running code like this?

do {
    for await accessory in try DockAccessoryManager.shared.accessoryStateChanges {
        // If this is an accessory you're interested in, save it for later use.
    }
} catch {
    log("Failed fetching state changes, \(error)")
}
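For illustration, a sketch of one way to host that loop in a SwiftUI app: an observable object starts the async sequence in a Task and publishes something the UI can show. The loop body is kept as schematic as the snippet in the post; the yielded element is only string-interpolated here rather than inspected:

import SwiftUI
import DockKit

@MainActor
final class DockAccessoryModel: ObservableObject {
    @Published var statusText = "Waiting for accessory…"

    func start() {
        Task {
            do {
                for await change in try DockAccessoryManager.shared.accessoryStateChanges {
                    // If this is an accessory you're interested in, save it
                    // for later use; here we just surface the event.
                    statusText = "State change: \(change)"
                }
            } catch {
                statusText = "Failed fetching state changes: \(error)"
            }
        }
    }
}

struct DockStatusView: View {
    @StateObject private var model = DockAccessoryModel()

    var body: some View {
        Text(model.statusText)
            .task { model.start() }
    }
}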
3 · 0 · 546 · Aug ’24
What is the maximum data processing speed?
For example: we use DockKit for birdwatching, so we have an unknown field distance and direction (say, observing from a rock). The task is to recognize the number of birds caught in the frame, draw a detection box around them, and collect statistics. Question: what is the maximum number of frames per second that can be processed with custom object recognition? And if that isn't enough, can I do the recognition myself and hand the results to DockKit for fast movement?
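On the last question: in principle yes, you can run recognition yourself and hand the results to DockKit, since track() accepts arbitrary observations (the zoom thread above shows the full call). A rough sketch, with the camera-information struct assumed to be built elsewhere:

import Vision
import DockKit

// Sketch: forward custom detections (e.g. birds from a Vision model) to the
// accessory. cameraInfo (intrinsics, reference dimensions, orientation) is
// assumed to be constructed as in the zoom thread above.
func forwardDetections(_ results: [VNRecognizedObjectObservation],
                       to accessory: DockAccessory,
                       cameraInfo: DockAccessory.CameraInformation) async throws {
    let observations = results.enumerated().map { index, bird in
        DockAccessory.Observation(
            identifier: index,        // stable identifiers help smooth tracking
            type: .object,
            rect: bird.boundingBox    // Vision returns normalized coordinates
        )
    }
    try await accessory.track(observations, cameraInformation: cameraInfo)
}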
0 · 0 · 667 · Apr ’24
Motorized dock hardware for DockKit?
I'd love to play around with DockKit, but I didn't see anything mentioned about hardware. I'm assuming Apple isn't releasing their own motorized dock, and I haven't seen anything about how to get hardware recognized by the accessory manager. I'd like to prototype a dock myself using an ESP32 and some stepper motors. I've already got this working with Bluetooth communication from iOS via CoreBluetooth, but I don't know whether there are specific service and characteristic UUIDs the system looks for to mark a device as DockKit-compatible. Would really love to start playing with this; anyone got any insights on how to get up and running?
7 · 2 · 2.4k · Jan ’24