Hello Apple,
I've been using iOS for many years and never had any issues with the Urdu keyboard, but since the 18.4 beta update some words are no longer typed correctly. For example, my friend's name is "راعنیہ", but the updated version cannot join the letters and keeps separating them, like "راعنی ہ". It's frustrating to type like this, and it's not just one word; there are many other words it cannot handle properly. The new font and reduced letter spacing also hurt my eyes while reading and typing. I hope Apple fixes this soon.
Thank you
General
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
Hello Apple Developers,
I’m reaching out to the community with a concept that I truly believe could be a natural fit for the Apple ecosystem:
A privacy-focused, iOS-exclusive dating app designed to enhance connections between Apple users while staying true to Apple’s commitment to security and user privacy.
The idea is to create an iOS-only dating platform that fosters relationships between users who are part of the Apple ecosystem. The app would integrate seamlessly with Apple’s services (iMessage, FaceTime, Siri, etc.) and provide a premium user experience, where privacy is a priority.
Apple users already prefer to communicate using Apple services (iMessage, FaceTime). A dating app designed specifically for iOS users would deepen this ecosystem lock-in, making it easier for Apple customers to connect within a trusted space.
Apple is already known for its privacy focus, and an iOS-exclusive dating app would build upon that reputation. It would ensure secure, private interactions, minimizing the risks associated with data sharing in most dating apps today.
The app could integrate directly with features like iCloud, Apple Pay (for date-night bookings), and Siri (for matchmaking suggestions), offering users a truly native iOS experience.
While the app would remain free to use, here are a few potential monetization methods:
Bundling with Apple One/iCloud+ for premium matchmaking features.
Apple Pay-based date-night deals with local partners.
I’d love to hear your thoughts on whether Apple might be open to this idea. Would there be any challenges from a technical or business perspective in creating a dating app exclusively for iOS users?
I’m looking forward to hearing from you all, and thank you for your time and insights.
Yours Truly,
CapNKirk
P.S. This is just an idea, and I do not care who uses, implements, or executes it. I just want to see Apple take advantage of it.
While using MusicKit SDK for Android 1.1.2, I found that MediaContainerType defines only three types:
NONE = 0;
ALBUM = 1;
PLAYLIST = 2;
The RADIO_STATION type is not defined.
However, the documentation of com.apple.android.music.playback.model states that the RADIO_STATION type is supported.
This causes an error after I pass in a station ID:
MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException
How can I solve this problem?
I use startCaptureWithHandler: to record the screen and AVAssetWriter's appendSampleBuffer: to save the audio and video, but when the saved file is played back, the audio and video are out of sync.
I don't know if it's an AVAssetWriterInput setup problem. Here is my code:
// Audio input: AAC, stereo, 44.1 kHz
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

// Video input: H.264 at 2x the screen's point size
NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
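For reference, the usual way to keep the two tracks aligned is to start the writer session at the first captured buffer's presentation timestamp rather than at zero. A minimal Swift sketch of that pattern (type and names are illustrative, not taken from the project above):

import AVFoundation

// Minimal sketch (illustrative names): start the writer session at the first buffer's
// presentation timestamp so the audio and video inputs share one timeline.
// Assumes the audio and video inputs were already added to `writer`.
final class CaptureWriter {
    private let writer: AVAssetWriter
    private var sessionStarted = false

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    }

    func append(_ sampleBuffer: CMSampleBuffer, to input: AVAssetWriterInput) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            guard writer.startWriting() else { return }
            writer.startSession(atSourceTime: pts)   // not .zero
            sessionStarted = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }
}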
Hello,
How do I find the Apple Music catalog ID for songs in an Apple Music playlist?
I'm building an iOS app that uses MusicKit and the Apple Music API.
For this example, assume my app simply lets users upload their Apple Music playlists, and when a user opens a specific playlist I want to find the catalog ID for each song.
Currently, all the songs return IDs in this format: "i.PkdJvPXI2AJgm8".
I believe these are library IDs, not catalog IDs.
I'd like a front-end solution using MusicKit, but perhaps a back-end solution using the Apple Music REST API is required.
Any recommendations would be appreciated!
Hey, just wondering if anyone knows whether the SDK for Android will be updated soon to support the new 16 KB memory page size requirement coming with Android 15.
Google is going to require all apps targeting Android 15+ to support it starting in November 2025.
Has anyone heard anything from Apple or the SDK team about this?
Thanks!
I’m spot-checking some of the data I’m extracting from the Apple Music Feed parquet files and finding numerous issues of invalid album IDs.
For example, just looking at any album with a primary artist id of 163043, I see a few albums that are not available at music.apple.com/us/album/NNN. These include:
981094158 - Time and the River
1803443737 - Celebration (Live New York ’80)
1525426873 - Anything You Want: The Warner-Reprise-Elektra Years
I notice that none of these album IDs are returned by the general Music API, so I'm a bit confused as to why they would exist in the parquet files at all.
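For reference, the kind of catalog check I mean can be sketched with MusicKit like this (the hard-coded ID is one of the albums above):

import MusicKit

// Sketch: ask the catalog for one of the feed's album IDs and see whether anything comes back.
func checkAlbum(id: String) async throws {
    let request = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: MusicItemID(id))
    let response = try await request.response()
    print(id, response.items.isEmpty ? "not found in catalog" : "found: \(response.items.first!.title)")
}

// Example: try await checkAlbum(id: "981094158")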
Thanks.
I maintain a couple of CoreImage libraries that provide custom Metal kernel backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:
-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.
I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.
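For context, the failing call is the standard Metal-backed color kernel pattern, roughly the following Swift sketch (the metallib resource name and argument list are placeholders, not the actual filter code):

import CoreImage

// Placeholder sketch of the pattern that breaks: load a Metal color kernel and apply it.
let libraryURL = Bundle.main.url(forResource: "default", withExtension: "metallib")! // placeholder resource name
let kernelData = try Data(contentsOf: libraryURL)
let kernel = try CIColorKernel(functionName: "FractalNoise3D", fromMetalLibraryData: kernelData)

// The kernel declares 12 explicit arguments; Core Image is expected to supply the implicit
// coreimage::destination as the 13th. On iOS/iPadOS 26 it apparently does not, which
// produces the "expected 13 but saw 12" error above.
let output = kernel.apply(extent: CGRect(x: 0, y: 0, width: 512, height: 512),
                          arguments: [/* the 12 kernel arguments */])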
Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
Getting an interesting error attempting to compile my app in the Xcode 26 beta:
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
Not sure what to try, or which dependency to pull, to fix this issue.
I have an app (currently in development) that needs to use FFmpeg, so I searched for how to embed FFmpeg in Apple apps and found this article: https://doc.qt.io/qt-6/qtmultimedia-building-ffmpeg-ios.html
It works correctly for iOS but not for macOS (I have made macOS-specific changes using ChatGPT and traditional web searching).
Drive link for the file and instructions which I'm following: https://drive.google.com/drive/folders/11wqlvb8SU2thMSfII4_Xm3Kc2fPSCZed?usp=share_link
Can someone from Apple, or anyone in general, help me figure out what I'm doing wrong?
This has been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the simulator. I develop for Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the simulator would be welcome (similar to how Photos works).
Cheers
In iOS 26, AVSpeechSynthesizer reads Mandarin text with Cantonese pronunciation.
No matter how I set the language, or change my phone's system settings, it doesn't work.
let utterance = AVSpeechUtterance(string: "你好啊")
//let voice = AVSpeechSynthesisVoice(language: "zh-CN") // does not work
let voice = AVSpeechSynthesisVoice(language: "zh-Hans") // does not work either
utterance.voice = voice
let synth = AVSpeechSynthesizer()
synth.speak(utterance)
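For completeness, selecting an explicit Mandarin voice from the installed voices, rather than using the language-code initializer, is another variant worth noting, although it may well hit the same behavior. A small sketch:

import AVFoundation

// Sketch: pick an explicit zh-CN (Mandarin) voice instead of relying on
// AVSpeechSynthesisVoice(language:). This may not avoid the iOS 26 behavior.
let mandarinVoice = AVSpeechSynthesisVoice.speechVoices().first { $0.language == "zh-CN" }
let utterance = AVSpeechUtterance(string: "你好啊")
utterance.voice = mandarinVoice
let synthesizer = AVSpeechSynthesizer() // keep a strong reference to this in real code
synthesizer.speak(utterance)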
I wanted to know when iOS 26 will be officially released.
Hi,
In the iOS 13 and macOS Catalina release notes, it says:
Metal CIKernel instances now support arguments with arbitrarily structured data.
I've been trying to use this functionality in a CIKernel, with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold, the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory alignment issue going on, but I've tried various types in my array and always get a similar result.
I have not found any documentation or sample code regarding this. It would be great to know how this is intended to work and what the limitations are.
In the forums there are two similar unanswered questions about data arguments, so I'm sure there are a few out there with similar issues.
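To make the question concrete, here is a trimmed Swift sketch of what I mean by a dynamically sized array argument (the kernel name and metallib resource are placeholders; the kernel is assumed to take a constant float pointer plus a count):

import CoreImage

// Placeholder sketch: pass a [Float] as raw Data to a Metal-backed CIKernel.
let libraryURL = Bundle.main.url(forResource: "default", withExtension: "metallib")! // placeholder
let kernel = try CIKernel(functionName: "weightedKernel", // placeholder function name
                          fromMetalLibraryData: Data(contentsOf: libraryURL))

let weights: [Float] = (0..<64).map { Float($0) / 64 }            // works at small sizes
let weightData = weights.withUnsafeBufferPointer { Data(buffer: $0) }

let input = CIImage(color: .gray).cropped(to: CGRect(x: 0, y: 0, width: 256, height: 256))
let output = kernel.apply(extent: input.extent,
                          roiCallback: { _, rect in rect },
                          arguments: [input, weightData, weights.count])
// Beyond some data size, the tail of the array appears to be discarded and results become unstable.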
Thanks!
Michael
Has anyone been able to successfully use MusicCatalogSearchRequest in a playgrounds app?
I have configured my playground similarly to a regular app: an App ID with automatic music token generation turned on, and music access authorized within the app itself, but whenever I run a MusicCatalogSearchRequest I get an error thrown with .developerTokenRequestFailed.
Considering MusicKit is restricted in the simulator, it would not surprise me if the same were true in playgrounds, but it would be super helpful if I could prototype with MusicKit in Playgrounds 4!
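For reference, the failing request is just a plain catalog search, roughly like this sketch (the search term is arbitrary):

import MusicKit

// Sketch of a plain catalog search; in the playground this throws an error
// with .developerTokenRequestFailed.
func search() async {
    do {
        let status = await MusicAuthorization.request()
        guard status == .authorized else { return }

        var request = MusicCatalogSearchRequest(term: "radiohead", types: [Song.self])
        request.limit = 5
        let response = try await request.response()
        print(response.songs)
    } catch {
        print("Search failed:", error)
    }
}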
Hi -
Of course I may be doing something wrong, but I'm getting exactly the opposite of what I would expect from
ApplicationMusicPlayer.shared.state.playbackStatus
It returns .playing when the music is paused and .paused when the music is playing.
Am I holding it wrong?
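A minimal sketch of how the value can be observed, using SwiftUI purely as an illustration:

import MusicKit
import SwiftUI

// Illustration: observe ApplicationMusicPlayer's state and display playbackStatus.
// What I see is .paused while music is audibly playing, and .playing while it's paused.
struct PlaybackStatusLabel: View {
    @ObservedObject private var state = ApplicationMusicPlayer.shared.state

    var body: some View {
        Text(String(describing: state.playbackStatus))
    }
}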
Thanks,
Daniel
I add an RPSystemBroadcastPickerView to the app, but after tapping it, no method of SampleHandler is triggered.
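For reference, the usual wiring looks like the sketch below; a common thing to check is that preferredExtension matches the bundle identifier of the broadcast upload extension that contains SampleHandler (the identifier here is a placeholder):

import ReplayKit
import UIKit

final class BroadcastViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Sketch: add the system broadcast picker. The bundle identifier is a placeholder;
        // it must match the broadcast upload extension that contains SampleHandler.
        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        picker.preferredExtension = "com.example.MyApp.BroadcastExtension"
        picker.showsMicrophoneButton = false
        view.addSubview(picker)
    }
}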
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera and then adds some extra information to the EXIF "MakerNote" field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read out of the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing MakerNote in Ventura.
This code is about 7 years old, is written in Objective-C, and worked with no issues until Ventura came along.
Calls used
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - Works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return "MakerNote" data
Is there a new way, introduced with Ventura, to read EXIF "MakerNote" data from image files?
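For reference, a Swift equivalent of the read path looks like the sketch below (the file path is a placeholder); on Ventura the returned EXIF dictionary is where the MakerNote entry goes missing:

import ImageIO
import Foundation

// Sketch of the read path: dump the EXIF dictionary ImageIO returns so the presence
// (or absence) of the MakerNote entry can be checked. File path is a placeholder.
let url = URL(fileURLWithPath: "/tmp/snapshot.jpg") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] {
    for (key, value) in exif {
        print(key, value)
    }
}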
Hey there, I'm trying to display all of the user's albums using the MediaPlayer framework. Many albums return nil artwork, but I know the artwork exists because it shows up in the stock Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this:
let query = MPMediaQuery.albums()
var albums: [MPMediaItemCollection] = []
if let albumCollections = query.collections {
    albums = albumCollections
}
for album in albums {
    // Many cloud albums come back with nil artwork here.
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!