Posts

Post not yet marked as solved
2 Replies
2.4k Views
Hi,

I'm researching possible ways to stream external video and audio into an iOS device. Up until now I've come up with the following "solutions" (more like hacks):

- Developing an MFi camera - I couldn't find any existing external camera that works using MFi, so I guess it's reasonable to assume Apple is not allowing MFi cameras. Will appreciate an official word on the matter.
- Wifi camera - There are several problems with this. Connectivity-wise, the iOS device needs to be on the same network as the wifi camera, and if the camera provides its own network, internet connectivity won't work (Wifi + Cellular can't work together, right?). Latency is also an issue.
- MultipeerConnectivity - This is not really a solution, since it only allows connecting to other iOS devices, right?
- Lightning to USB 3 Adapter - This allows connecting a PTP/mass-storage device, and only to iPads afaik.

Will appreciate any other ideas on how to connect to an external camera stream!
Posted Last updated
Post marked as solved
3 Replies
1.5k Views
While developing our app, which uses CoreLocation and has Always permission, we've stumbled on a weird behaviour.

As stated in the latest documentation, on iOS 11 and above, apps that have Always permission and use location in the background need to set showsBackgroundLocationIndicator if they want the OS to show the blue bar indicator. The issue is that when we test our app on a device running an old iOS version (10.3.2), the OS doesn't show the blue bar indicator. The showsBackgroundLocationIndicator assignment is wrapped in an if #available(iOS 11.0, *) { ... } clause (otherwise it won't compile), so that line doesn't get called when running on iOS 10.

I expected the blue bar to be shown, as that's the default behaviour prior to iOS 11 (maybe I'm missing something here), but I can't get it to work on iOS 10. If I remove the Always permission from the app, it works as expected.

Thanks,
Nimrod
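For reference, the availability-gated setup described above looks roughly like this. This is a minimal sketch, not our full manager configuration; the surrounding authorization and background-update calls are assumed boilerplate:

```swift
import CoreLocation

let manager = CLLocationManager()
manager.requestAlwaysAuthorization()
manager.allowsBackgroundLocationUpdates = true

if #available(iOS 11.0, *) {
    // iOS 11+: the blue bar is opt-in via this property.
    manager.showsBackgroundLocationIndicator = true
}
// On iOS 10 and earlier the property doesn't exist, so this branch is
// skipped; the blue bar should (per the pre-iOS 11 default) be shown
// automatically by the system.

manager.startUpdatingLocation()
```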
Posted Last updated
Post marked as solved
3 Replies
2.1k Views
Hi,

We're writing code to decode a real-time network RTSP video stream (using ffmpeg + ************) on iOS. We've experienced issues that *seem* to be related to our work-queue loop getting throttled/paused during the parsing + decoding process. Under load (and noticeably more on weaker devices) we experience "gaps" - times when the thread seems to be paused, so the video stream jumps. We've boiled the code down to the basics of just parsing the RTSP stream and decoding H.264 frames and still get "gaps" (packet loss is minimal, from what we've checked with rvictl and Wireshark).

I've stumbled upon Technical Note TN2169 - High Precision Timers in iOS / OS X - which explains how to set a pthread to real-time scheduling, but I want to make sure it actually works on iOS (in the comments of an article from 2011, someone stated that Apple disabled real-time scheduling of pthreads on iOS) and whether it's the correct solution to the issue.

Running the code with a real-time pthread gives the impression that there are fewer gaps (we're printing the times of the gaps while streaming), but we've been working on this for so long that it might be a lie.

Thanks,
Nimrod
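For anyone landing here, the TN2169 approach boils down to applying a Mach time-constraint policy to the decode thread. A Swift sketch of that, assuming the caller runs it from the thread doing the decoding; the 5 ms / 10 ms budgets are illustrative values (TN2169 uses the same ones) and need tuning to your actual per-frame workload:

```swift
import Darwin

/// Promote the calling thread to real-time (time-constraint) scheduling,
/// per TN2169. Returns true if the kernel accepted the policy.
func promoteCurrentThreadToRealTime() -> Bool {
    // Mach time-constraint budgets are expressed in absolute-time units,
    // so first get the conversion factor from nanoseconds.
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)
    let msToAbs = Double(timebase.denom) / Double(timebase.numer) * 1_000_000.0

    var policy = thread_time_constraint_policy(
        period: 0,                          // not strictly periodic
        computation: UInt32(5 * msToAbs),   // assumed: ~5 ms of work per wakeup
        constraint: UInt32(10 * msToAbs),   // assumed: must finish within 10 ms
        preemptible: 0)

    let count = mach_msg_type_number_t(
        MemoryLayout<thread_time_constraint_policy>.size
            / MemoryLayout<integer_t>.size)

    // thread_policy_set takes the policy as a flat array of integer_t.
    return withUnsafeMutablePointer(to: &policy) {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            thread_policy_set(mach_thread_self(),
                              thread_policy_flavor_t(THREAD_TIME_CONSTRAINT_POLICY),
                              $0, count) == KERN_SUCCESS
        }
    }
}
```

Note that a real-time thread that overruns its stated computation budget can be demoted by the scheduler, so honest budgets matter more than aggressive ones.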
Posted Last updated