I am capturing a screenshot with SCScreenshotManager's captureImageWithFilter. The resulting PNG has the same resolution as the PNG taken from Command-Shift-3 (4112x2658) but is 10x larger (14.4MB vs 1.35MB).
My SCStreamConfiguration uses the SCDisplay's width and height and sets the color space to kCGColorSpaceSRGB.
I currently save to file by initializing an NSBitmapImageRep with initWithCGImage:, then creating PNG data with representationUsingType:NSBitmapImageFileTypePNG, then writeToFile:atomically:.
Is there some configuration or compression setting I can use to bring the PNG size more closely in line with a Command-Shift-3 screenshot?
Thanks!
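For reference, my save path is essentially the following (a minimal sketch; the function name and error handling are illustrative, not my exact project code):

```swift
import AppKit

// Sketch of the save path described above: wrap the CGImage in an
// NSBitmapImageRep, produce PNG data, and write it atomically.
func savePNG(_ cgImage: CGImage, to url: URL) throws {
    let rep = NSBitmapImageRep(cgImage: cgImage)
    guard let data = rep.representation(using: .png, properties: [:]) else {
        throw CocoaError(.fileWriteUnknown)
    }
    try data.write(to: url, options: .atomic)
}
```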
When I receive the interruption notification with type AVAudioSessionInterruptionTypeBegan, I pause music playback.
When I receive the interruption notification with type AVAudioSessionInterruptionTypeEnded, I resume music playback.
However, sometimes I get the error code AVAudioSessionErrorCodeCannotInterruptOthers (560557684).
If some misbehaving app hogs the audio session, a third-party app's attempt to resume music playback fails with AVAudioSessionErrorCodeCannotInterruptOthers.
In this case, can we find out which app is hogging the audio?
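The handling described above is roughly the following (a sketch; pausePlayback() and resumePlayback() are placeholders for the app's own player calls, not real API):

```swift
import AVFoundation

// Placeholders for the app's own player logic (hypothetical).
func pausePlayback() { /* app-specific */ }
func resumePlayback() { /* app-specific */ }

func observeInterruptions() {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main
    ) { note in
        guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
        switch type {
        case .began:
            pausePlayback()
        case .ended:
            do {
                // Reactivating the session is where CannotInterruptOthers
                // (560557684) surfaces.
                try AVAudioSession.sharedInstance().setActive(true)
                resumePlayback()
            } catch {
                print("resume failed: \(error)")
            }
        @unknown default:
            break
        }
    }
}
```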
Doing some testing around player behaviour when a license expires in Safari on macOS. I had the following outcomes:
On Safari 17.4.1 on an Intel-based Mac running Ventura, playback stopped when the license expired.
On Safari 17.4.1 on an M3 running Sonoma, playback stalled briefly and then continued to play without limit.
On Safari 16.5.2 on an M2 running Ventura, playback stalled briefly and then continued to play without limit.
In each case, the brief stall occurred at the time the license expired.
I parsed the license and everything is set correctly for a lease license type:
{
"version" : 1,
"payloadLength" : 1072,
"iv" : "0H9NCXLQeh1ziYpmJXsnwQ==",
"assetId" : "䙁㉌䉃\u0000\u0000\u0000",
"hdcp" : "TYPE_0_REQUIRED",
"contentKeyDuration" : {
"leaseDurationSeconds" : 300,
"rentalDurationSeconds" : 0,
"persistenceAllowed" : false
},
"keyType" : "Lease"
}
I cannot find any information relating to this behaviour.
Per the FPS docs for the lease license type: "If the content key is not renewed, the Apple device stops the playback when the lease expires." That is what is observed on the Intel-based MacBook.
This code to write UIImage data as HEIC works in the iOS simulator with iOS < 17.5:
import AVFoundation
import UIKit

extension UIImage {
    public var heic: Data? { heic() }

    public func heic(compressionQuality: CGFloat = 1) -> Data? {
        let mutableData = NSMutableData()
        guard let destination = CGImageDestinationCreateWithData(mutableData, AVFileType.heic as CFString, 1, nil),
              let cgImage = cgImage else {
            return nil
        }
        let options: NSDictionary = [
            kCGImageDestinationLossyCompressionQuality: compressionQuality,
            kCGImagePropertyOrientation: cgImageOrientation.rawValue,
        ]
        CGImageDestinationAddImage(destination, cgImage, options)
        guard CGImageDestinationFinalize(destination) else { return nil }
        return mutableData as Data
    }

    public var isHeicSupported: Bool {
        (CGImageDestinationCopyTypeIdentifiers() as! [String]).contains("public.heic")
    }

    var cgImageOrientation: CGImagePropertyOrientation { .init(imageOrientation) }
}

extension CGImagePropertyOrientation {
    init(_ uiOrientation: UIImage.Orientation) {
        switch uiOrientation {
        case .up: self = .up
        case .upMirrored: self = .upMirrored
        case .down: self = .down
        case .downMirrored: self = .downMirrored
        case .left: self = .left
        case .leftMirrored: self = .leftMirrored
        case .right: self = .right
        case .rightMirrored: self = .rightMirrored
        @unknown default: fatalError()
        }
    }
}
But with the iOS 17.5 simulator it seems to be broken. The call to CGImageDestinationFinalize writes this error to the console:
writeImageAtIndex:962: *** CMPhotoCompressionSessionAddImage: err = kCMPhotoError_UnsupportedOperation [-16994] (codec: 'hvc1')
On physical devices it still seems to work.
Is there any known workaround for the iOS simulator?
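One fallback I'm considering (an assumption on my part, not a confirmed fix): try HEIC first and fall back to JPEG when encoding fails, since the simulator failure only shows up when finalizing. This uses heic(compressionQuality:) and isHeicSupported from the extension above; the wrapper function is hypothetical:

```swift
import UIKit

// Hypothetical fallback: prefer HEIC, but fall back to JPEG when the
// encoder fails (e.g. in the simulator). Note that isHeicSupported may
// still report true there, so checking the return of heic() is the
// actual guard.
func encodedData(for image: UIImage, quality: CGFloat = 0.8) -> Data? {
    if image.isHeicSupported, let heic = image.heic(compressionQuality: quality) {
        return heic
    }
    return image.jpegData(compressionQuality: quality)
}
```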
So I meant to make a shared album but made a shared library instead. That being said, I deleted the shared album, but my family cannot remove it from their phones.
I have an app that has the camera continuously running; since it does its own AI, I have zero need for Apple's video effects, and I am seeing a 200% performance hit after updating to Sonoma. The video effects are the "heaviest stack trace" when profiling my app with the Instruments CPU profiler (see below).
Isn't forcing your software onto developers something Microsoft would do? Is there really no way to opt out?
6671 Jamscape_exp (23038)
2697 start_wqthread
2697 _pthread_wqthread
2183 _dispatch_workloop_worker_thread
2156 _dispatch_root_queue_drain_deferred_wlh
2153 _dispatch_lane_invoke
2146 _dispatch_lane_serial_drain
1527 _dispatch_client_callout
1493 _dispatch_call_block_and_release
777 __88-[PTHandGestureDetector initWithFrameSize:asyncInitQueue:externalHandDetectionsEnabled:]_block_invoke
777 -[VCPHandGestureVideoRequest initWithOptions:]
508 -[VCPHandGestureClassifier initWithMinHandSize:]
508 -[VCPCoreMLRequest initWithModelName:]
506 +[MLModel modelWithContentsOfURL:configuration:error:]
506 -[MLModelAsset modelWithError:]
506 -[MLModelAsset load:]
506 +[MLLoader loadModelFromAssetAtURL:configuration:error:]
506 +[MLLoader _loadModelFromAssetAtURL:configuration:loaderEvent:error:]
505 +[MLLoader _loadModelFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:]
505 +[MLLoader _loadWithModelLoaderFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:]
505 +[MLLoader _loadModelFromArchive:configuration:modelVersion:compilerVersion:loaderEvent:useUpdatableModelLoaders:loadingClasses:error:]
505 +[MLLoader _loadModelWithClass:fromArchive:modelVersionInfo:compilerVersionInfo:configuration:error:]
445 +[MLMultiFunctionProgramEngine loadModelFromCompiledArchive:modelVersionInfo:compilerVersionInfo:configuration:error:]
333 -[MLMultiFunctionProgramEngine initWithProgramContainer:configuration:error:]
333 -[MLNeuralNetworkEngine initWithContainer:configuration:error:]
318 -[MLNeuralNetworkEngine _setupContextAndPlanWithConfiguration:usingCPU:reshapeWithContainer:error:]
313 -[MLNeuralNetworkEngine _addNetworkToPlan:error:]
313 espresso_plan_add_network
313 EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t)
313 EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t, std::__1::shared_ptr<Espresso::net>)
313 Espresso::load_network(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::compute_path, bool)
235 Espresso::reload_network_on_context(std::__1::shared_ptr<Espresso::net> const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::compute_path)
226 Espresso::load_and_shape_network(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&)
214 Espresso::load_network_layers_internal(std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::shared_ptr<Espresso::abstract_context> const&, Espresso::network_shape const&, std::__1::basic_istream<char, std::__1::char_traits<char>>, Espresso::compute_path, bool, std::__1::shared_ptr<Espresso::blob_storage_abstract> const&)
208 Espresso::run_dispatch_v2(std::__1::shared_ptr<Espresso::abstract_context>, std::__1::shared_ptr<Espresso::net>, std::__1::vector<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::allocator<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>>> const&, Espresso::network_shape const&, Espresso::compute_path const&, std::__1::basic_istream<char, std::__1::char_traits<char>>)
141 try_dispatch(std::__1::shared_ptr<Espresso::abstract_context>, std::__1::shared_ptr<Espresso::net>, std::__1::vector<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::allocator<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>>> const&, Espresso::network_shape const&, Espresso::compute_path const&, std::__1::basic_istream<char, std::__1::char_traits<char>>, Espresso::platform const&, Espresso::compute_path const&)
131 Espresso::get_net_info_ir(std::__1::shared_ptr<Espresso::abstract_context>, std::__1::shared_ptr<Espresso::net>, std::__1::vector<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::allocator<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>>> const&, Espresso::network_shape const&, Espresso::compute_path const&, Espresso::platform const&, Espresso::compute_path const&, std::__1::shared_ptr<Espresso::cpu_context_transfer_algo_t>&, std::__1::shared_ptr<Espresso::net_info_ir_t>&, std::__1::shared_ptr<Espresso::kernels_validation_status_t>&)
131 Espresso::cpu_context_transfer_algo_t::create_net_info_ir(std::__1::vector<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::allocator<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>>> const&, std::__1::shared_ptr<Espresso::abstract_context>, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptr<Espresso::net_info_ir_t>)
120 Espresso::cpu_context_transfer_algo_t::check_all_kernels_availability_on_context(std::__1::vector<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::allocator<std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>>> const&, std::__1::shared_ptr<Espresso::abstract_context>&, Espresso::compute_path, std::__1::shared_ptr<Espresso::net_info_ir_t>&)
120 is_kernel_available_on_engine(unsigned long, std::__1::shared_ptr<Espresso::base_kernel>, Espresso::kernel_info_t const&, std::__1::shared_ptr<Espresso::SerDes::generic_serdes_object>, std::__1::shared_ptr<Espresso::abstract_context>, Espresso::compute_path, std::__1::shared_ptr<Espresso::net_info_ir_t>, std::__1::shared_ptr<Espresso::kernels_validation_status_t>)
83 Espresso::ANECompilerEngine::mix_reshape_kernel::is_valid_for_engine(std::__1::shared_ptr<Espresso::kernels_validation_status_t>, Espresso::base_kernel::validate_for_engine_args_t const&) const
45 int ValidateLayer<ANECReshapeLayerDesc, ZinIrReshapeUnit, ZinIrReshapeUnitInfo, ANECReshapeLayerDescAlternate>(void*, ANECReshapeLayerDesc const*, ANECTensorDesc const*, unsigned long, unsigned long*, ANECReshapeLayerDescAlternate**, ANECTensorValueDesc const*)
45 void ValidateLayer_Impl<ANECReshapeLayerDesc, ZinIrReshapeUnit, ZinIrReshapeUnitInfo, ANECReshapeLayerDescAlternate>(void*, ANECReshapeLayerDesc const*, ANECTensorDesc const*, unsigned long, unsigned long*, ANECReshapeLayerDescAlternate**, ANECTensorValueDesc const*)
(...)
Hi guys,
I'm investigating a failure to play a low-latency Live HLS stream, and I'm getting the following error:
(String) “<AVPlayerItemErrorLog: 0x30367da10>\n#Version: 1.0\n#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)\n#Date: 2024/05/17 13:11:46.046\n#Fields: date time uri cs-guid s-ip status domain comment cs-iftype\n2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 \“CoreMediaErrorDomain\” \“Low Latency: Server must support http2 ECN and SACK\” -\n2024/05/17 13:11:17.017 -15410 \“CoreMediaErrorDomain\” \“Invalid server blocking reload behavior for low latency\” -\n2024/05/17 13:11:17.017
The stream works when loading from dev server with TLS 1.3, but fails on CDN servers with TLS 1.2.
Regular Live streams and VOD streams work normally on those CDN servers.
I tried to configure TLSv1.2 in Info.plist, but that didn't help.
When running
nscurl --ats-diagnostics --verbose
it is passing for the server with TLS 1.3, but failing for CDN servers with TLS 1.2 due to error Code=-1005 "The network connection was lost."
Is TLS 1.3 required or just recommended?
Referring to
https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls
and
https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis
Is it possible to configure AVPlayer to skip ECN and SACK validation?
Thanks.
Hello everyone, I have been receiving this same crash report for the past month whenever I try and export a Final Cut Pro project. The FCP video will get to about 88% completion of export, then the application crashes and I get the attached report. Any leads on how to fix this would be greatly appreciated! Thank you.
-Lauren
The above is the extra_data in the lhvC box in the 3D format of Apple's Vision Pro, which only contains SPS/PPS.
I can tell that 0xa1 is the SPS NAL unit type, 0x00 0x01 is the number of SPS entries, and 0x00 0x17 is the length.
But what is the 0x01 f0 00 fc c3 02 at the beginning? I can't find the corresponding definition.
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or any remote access) generates the same conditions making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
I'm trying to expose my native ShazamKit code to the host React Native app.
The implementation works fine in a separate Swift project, but it fails when I try to integrate it into a React Native app.
Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)' was thrown while invoking exposed on target ShazamIOS with params (
1682,
1683
)
callstack: (
0 CoreFoundation 0x00007ff80049b761 __exceptionPreprocess + 242
1 libobjc.A.dylib 0x00007ff800063904 objc_exception_throw + 48
2 CoreFoundation 0x00007ff80049b56b +[NSException raise:format:] + 0
3 AVFAudio 0x00007ff846197929 _Z19AVAE_RaiseExceptionP8NSStringz + 156
4 AVFAudio 0x00007ff8461f2e90 _ZN17AUGraphNodeBaseV318CreateRecordingTapEmjP13AVAudioFormatU13block_pointerFvP16AVAudioPCMBufferP11AVAudioTimeE + 766
5 AVFAudio 0x00007ff84625f703 -[AVAudioNode installTapOnBus:bufferSize:format:block:] + 1456
6 muse 0x000000010a313dd0 $s4muse9ShazamIOSC6record33_35CC2309E4CA22278DC49D01D96C376ALLyyF + 496
7 muse 0x000000010a313210 $s4muse9ShazamIOSC5startyyF + 288
8 muse 0x000000010a312d03 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtF + 83
9 muse 0x000000010a312e47 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtFTo + 103
10 CoreFoundation 0x00007ff8004a238c __invoking___ + 140
11 CoreFoundation 0x00007ff80049f6b3 -[NSInvocation invoke] + 302
12 CoreFoundation 0x00007ff80049f923 -[NSInvocation invokeWithTarget:] + 70
13 muse 0x000000010a9210ef -[RCTModuleMethod invokeWithBridge:module:arguments:] + 2495
14 muse 0x000000010a925cb4 _ZN8facebook5reactL11invokeInnerEP9RCTBridgeP13RCTModuleDatajRKN5folly7dynamicEiN12_GLOBAL__N_117SchedulingContextE + 2036
15 muse 0x000000010a925305 _ZZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEiENK3$_0clEv + 133
16 muse 0x000000010a925279 ___ZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEi_block_invoke + 25
17 libdispatch.dylib 0x000000010e577747 _dispatch_call_block_and_release + 12
18 libdispatch.dylib 0x000000010e5789f7 _dispatch_client_callout + 8
19 libdispatch.dylib 0x000000010e5808c9 _dispatch_lane_serial_drain + 1127
20 libdispatch.dylib 0x000000010e581665 _dispatch_lane_invoke + 441
21 libdispatch.dylib 0x000000010e58e76e _dispatch_root_queue_drain_deferred_wlh + 318
22 libdispatch.dylib 0x000000010e58db69 _dispatch_workloop_worker_thread + 590
23 libsystem_pthread.dylib 0x000000010da67b84 _pthread_wqthread + 327
24 libsystem_pthread.dylib 0x000000010da66acf start_wqthread + 15
)
RCTFatal
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)
This is my Swift file; the error happens in the record function.
import Foundation
import ShazamKit
import AVFoundation // needed for AVAudioSession / AVAudioEngine

@objc(ShazamIOS)
class ShazamIOS: NSObject {

    @Published var matching: Bool = false
    @Published var mediaItem: SHMatchedMediaItem?
    @Published var error: Error? {
        didSet {
            hasError = error != nil
        }
    }
    @Published var hasError: Bool = false

    private lazy var audioSession: AVAudioSession = .sharedInstance()
    private lazy var session: SHSession = .init()
    private lazy var audioEngine: AVAudioEngine = .init()
    private lazy var inputNode = self.audioEngine.inputNode
    private lazy var bus: AVAudioNodeBus = 0

    override init() {
        super.init()
        session.delegate = self
    }

    @objc
    func exposed(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        start()
        resolve("ios code executed")
    }

    func start() {
        switch audioSession.recordPermission {
        case .granted:
            self.record()
        case .denied:
            DispatchQueue.main.async {
                self.error = ShazamError.recordDenied
            }
        case .undetermined:
            audioSession.requestRecordPermission { granted in
                DispatchQueue.main.async {
                    if granted {
                        self.record()
                    } else {
                        self.error = ShazamError.recordDenied
                    }
                }
            }
        @unknown default:
            DispatchQueue.main.async {
                self.error = ShazamError.unknown
            }
        }
    }

    private func record() {
        do {
            self.matching = true
            let format = self.inputNode.outputFormat(forBus: bus)
            self.inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] (buffer, time) in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            self.audioEngine.prepare()
            try self.audioEngine.start()
        } catch {
            self.error = error
        }
    }

    func stop() {
        self.audioEngine.stop()
        self.inputNode.removeTap(onBus: bus)
        self.matching = false
    }

    @objc
    static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

extension ShazamIOS: SHSessionDelegate {
    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async { [self] in
            if let mediaItem = match.mediaItems.first {
                self.mediaItem = mediaItem
                self.stop()
            }
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        DispatchQueue.main.async { [self] in
            self.error = error
            self.stop()
        }
    }
}
Objective-C file:
#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"
@interface RCT_EXTERN_MODULE(ShazamIOS, NSObject);
RCT_EXTERN_METHOD(exposed:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
@end
How I consume the exposed function in RN:
const {ShazamModule, ShazamIOS} = NativeModules;
const onPressIOSButton = () => {
ShazamIOS.exposed().then(result => console.log(result)).catch(e => console.log(e.message, e.code));
};
I'm using the systemMusicPlayer to play music and want to update the playback time using addObserver forKeyPath.
[self setMusicPlayer: [MPMusicPlayerController systemMusicPlayer]];
I've tried these two methods:
[self addObserver:self forKeyPath:@"musicPlayer.currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
[self.musicPlayer addObserver:self forKeyPath:@"currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
I do get the initial values for currentPlaybackTime in:
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
but I never get any calls when the player is playing the song (the whole point).
If I set the currentPlaybackTime to a specific value (locating manually using a slider), I get calls with the values I set (useless since I know what I am setting them to).
How are we supposed to track the playback time without just polling it constantly?
Hi everyone, I am having a problem on AVPlayer when I try to play some videos.
The video starts for a few seconds, but immediately after I see a black screen and in the console there is the following error:
<__NSArrayM 0x14dbf9f30>(
{
StreamPlaylistError = "-12314";
comment = "have audio audio-aacl-54 in STREAMINF without EXT-X-MEDIA audio group";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
},
{
"c-conn-type" = 1;
"c-severity" = 2;
comment = "Playlist parse error";
"cs-guid" = "871C1871-D566-4A3A-8465-2C58FDC18A19";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
}
)
I connect two AVAudioNodes by using
- (void)connectMIDI:(AVAudioNode *)sourceNode to:(AVAudioNode *)destinationNode format:(AVAudioFormat * __nullable)format eventListBlock:(AUMIDIEventListBlock __nullable)tapBlock
and add a AUMIDIEventListBlock tap block to it to capture the MIDI events.
Both AUAudioUnits of the AVAudioNodes involved in this connection are set to use MIDI 1.0 UMP events:
[[avAudioUnit AUAudioUnit] setHostMIDIProtocol:(kMIDIProtocol_1_0)];
But all the MIDI voice channel events received are automatically converted to UMP MIDI 2.0 format. Is there something else I need to set so that the tap receives MIDI 1.0 UMPs?
(Note: My app can handle MIDI 2.0, so it is not really a problem. So this question is mainly to find out if I forgot to set the protocol somewhere...).
Thanks!!
I'm trying to cast the screen from an iOS device to an Android device.
I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox for compressing the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression.
While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android.
Data transmission over the TCP socket seems to be functioning correctly.
My question is:
Is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?
Here's a breakdown of the iOS sender details:
Device: iPhone 13 mini running iOS 17
Development Environment: Xcode 15 with a minimum deployment target of iOS 16
Screen Capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
Video Compression: VideoToolbox for H.264 compression
Compression Properties:
kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
kVTCompressionPropertyKey_RealTime: true (real-time encoding)
kVTCompressionPropertyKey_Quality: 1 (maximum quality; the key ranges from 0.0 to 1.0)
NAL Unit Handling: Custom header is added to NAL units
Android Receiver Details:
Device: RedMi 7A running Android 10
Video Decoding: MediaCodec API for receiving and decoding the H.264 stream
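On the iOS side, the compression properties listed above are applied roughly like this (a sketch only; session creation, the output callback, and error checking are omitted):

```swift
import VideoToolbox

// Sketch: applying the compression properties listed above to an
// existing VTCompressionSession (creation and callbacks not shown).
func configure(_ session: VTCompressionSession) {
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ConstantBitRate,
                         value: 6_144_000 as CFNumber)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Main_AutoLevel)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MaxKeyFrameInterval,
                         value: 60 as CFNumber)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_Quality,
                         value: 1.0 as CFNumber)
}
```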
How can we implement MusicKit in a Flutter app?
I'm trying to generate a developer token with the following code in Node.js:
const jwt = require('jsonwebtoken')
const { TEAM_ID, KID, APPLE_PRIVATE_KEY } = require('./secret')

const expiration = 36000;
const currentTime = Math.floor(Date.now() / 1000);
const expirationTime = currentTime + expiration;

const options = {
  algorithm: 'ES256',
  header: {
    alg: "ES256",
    kid: KID
  }
}

const payload = {
  iss: TEAM_ID,
  iat: currentTime,
  exp: expirationTime
}

const newToken = jwt.sign(payload, APPLE_PRIVATE_KEY, options)
console.log('1111111111111111111111', newToken)
When testing my newToken with curl, I'm getting a 401 response.
Please help.
I have generated FCPXML, but I can't figure out the issue:
<?xml version="1.0"?>
<fcpxml version="1.11">
<resources>
<format id="r1" name="FFVideoFormat3840x2160p2997" frameDuration="1001/30000s" width="3840" height="2160" colorSpace="1-1-1 (Rec. 709)"/>
<asset id="video0" name="11a(1-5).mp4" start="0s" hasVideo="1" videoSources="1" duration="6.81s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/11a(1-5).mp4"/>
</asset>
<asset id="video1" name="12(4)r8 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="9.94s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/12(4)r8 mute.mp4"/>
</asset>
<asset id="video2" name="13 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="6.51s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13 mute.mp4"/>
</asset>
<asset id="video3" name="13x (8,14,24,29,38).mp4" start="0s" hasVideo="1" videoSources="1" duration="45.55s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13x (8,14,24,29,38).mp4"/>
</asset>
</resources>
<library>
<event name="Untitled">
<project name="Untitled Project" uid="28B2D4F3-05C4-44E7-8D0B-70A326135EDD" modDate="2024-04-17 15:44:26 -0400">
<sequence format="r1" duration="4802798/30000s" tcStart="0s" tcFormat="NDF" audioLayout="stereo" audioRate="48k">
<spine>
<asset-clip ref="video0" offset="0/10000s" name="11a(1-5).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video1" offset="12119/10000s" name="12(4)r8 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video2" offset="22784/10000s" name="13 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video3" offset="34544/10000s" name="13x (8,14,24,29,38).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
</spine>
</sequence>
</project>
</event>
</library>
</fcpxml>
Any ideas?
Is there any update to FairPlay in iOS 17? So far I have found no relevant information.
iOS 16 works fine.
Hi,
I am developing an iOS mobile camera app, and I noticed one issue related to user privacy. When AVCaptureVideoStabilizationModeStandard is set on an AVCaptureConnection whose sessionPreset is 1920x1080, the FOV of a photo taken with the system API is bigger than the preview stream and shows more content, especially on the iPhone 15 Pro Max rear camera. I think this inconsistency will cause a user privacy issue. Can you show me a solution if I don't want to turn the stabilization mode off? I tried other devices and the issue doesn't occur, but on the iPhone 15 Pro Max it is very obvious.
Any suggestions are appreciated.