Does anyone know how to play a specific instrument sound on iPhone or iPad when a button or similar control on screen is tapped?
I'm in the middle of building a music-learning app, and I'd like to assign single notes or chords to button-like frames on an on-screen keyboard or fretboard and have them sound when tapped.
Can this be done with SwiftUI code alone?
I recall that the OS once offered a General MIDI (Level 1) instrument playback capability, but the current OS doesn't seem to provide the same feature.
I'd appreciate any advice.
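In case it helps, the role of the old General MIDI synth is these days usually filled by AVAudioUnitSampler, which works fine behind a SwiftUI button. A minimal sketch, with the caveat that "Instrument.sf2" is a placeholder for a SoundFont/DLS file you'd bundle yourself:
import AVFoundation
import AudioToolbox

// Sketch: a GM-style sampler driven from SwiftUI button taps.
final class NotePlayer {
    private let engine = AVAudioEngine()
    private let sampler = AVAudioUnitSampler()

    init() throws {
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        // "Instrument.sf2" is a placeholder; bundle your own sound bank.
        if let bank = Bundle.main.url(forResource: "Instrument", withExtension: "sf2") {
            try sampler.loadSoundBankInstrument(at: bank,
                                                program: 0, // e.g. piano
                                                bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                                bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        }
        try engine.start()
    }

    // Call from a Button action; pass several notes at once for a chord.
    func play(notes: [UInt8], velocity: UInt8 = 100) {
        notes.forEach { sampler.startNote($0, withVelocity: velocity, onChannel: 0) }
    }

    func stop(notes: [UInt8]) {
        notes.forEach { sampler.stopNote($0, onChannel: 0) }
    }
}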
I'm trying to validate a low-latency HLS fragmented-MP4 setup with mediastreamvalidator. I get the following error:
Error: Invalid URL
Detail: '(null)' is not a valid URL
Source: mediaplaylistURL.m3u8 - segmentURL.mp4
The mediastreamvalidator version is 1.23.14.
What does that error mean?
Topic:
Media Technologies
SubTopic:
Streaming
I have a SwiftUI app (https://youtu.be/VbAfUk_eYl0?si=JxUBh0Bpb-vc1E1U) which I thought was almost ready for release: a manager for AirDropped audio files from Logic Pro or other music-creation applications. It uses AVAudioEngine and AVAudioPlayerNode to play audio, and the MediaPlayer API to integrate with car audio and similar, all of which works well.
It does not currently have an explicit CarPlay integration (and I'm slightly horrified at the amount of work that is going to require).
Yesterday I had the good (or bad) luck of getting a loaner car with CarPlay while mine is being repaired, and lo and behold, when connected to the vehicle via CarPlay, there is no audio output in the vehicle at all. The Now Playing panel correctly shows the information my app provides about the currently playing song; the player node believes it is playing, and the AVAudioSession is configured as it should be. But there is no sound.
Obviously I cannot ship it in this state.
I've tried fiddling with the parameters the AVAudioSession is configured with, in case there was some parameter that was preventing audio output, to no avail - currently:
var options = AVAudioSession.CategoryOptions()
options.insert(.allowAirPlay)
options.insert(.allowBluetooth)
options.insert(.allowBluetoothA2DP)
try session.setCategory(.playback, mode: .default, options: options)
try? session.setPreferredIOBufferDuration(0.002) // ~88 samples at 44.1kHz
try? session.setPrefersNoInterruptionsFromSystemAlerts(true)
try? session.setPrefersInterruptionOnRouteDisconnect(false)
try session.setActive(true, options: [.notifyOthersOnDeactivation])
All diagnostics within the app show the player operating correctly - files are played and flushed; AVAudioPlayerNodeCompletionCallbacks are called when they should be. But the output is not audible in the vehicle.
I would much prefer to ship this app without full-blown CarPlay integration, but with working audio when connected via CarPlay, and work on full CarPlay integration for the next release.
Is there some secret handshake I am just missing to make this work?
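In case it's useful to anyone debugging something similar: one hedged diagnostic is to dump the session's current route when playback starts, to confirm CarPlay is actually the active output. A small sketch, not a fix:
import AVFoundation

// Sketch: log where audio is actually being routed.
func logCurrentRoute() {
    let route = AVAudioSession.sharedInstance().currentRoute
    for output in route.outputs {
        // When CarPlay is the live output, portType should be .carAudio.
        print("Output: \(output.portName) type: \(output.portType.rawValue)")
    }
}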
I'm developing a visionOS app. I want to know how to play spatial audio other than through RealityKit. And on iOS or macOS, how would one play spatial audio without RealityKit?
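Not an authoritative answer, but outside RealityKit the usual route is AVAudioEngine with an AVAudioEnvironmentNode, which works on iOS and macOS as well. A minimal sketch:
import AVFoundation

// Sketch: positional (spatial) audio with AVAudioEngine, no RealityKit required.
func makeSpatialPlayer() throws -> (AVAudioEngine, AVAudioPlayerNode) {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Mono sources are required for 3D spatialization.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    player.renderingAlgorithm = .HRTFHQ                  // binaural rendering
    player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)  // place the source in space

    try engine.start()
    return (engine, player)
}
// Schedule a mono file or buffer on the returned player and call play() as usual.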
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths.
Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures.
Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
HTTP Live Streaming
AVFoundation
Hello, hope everybody is doing well. I have some reels content (9:16 aspect ratio) that I want to play back on iOS phones.
My question: does AVPlayer support out-of-the-box playback of encrypted MP4? Please note this is not HLS fMP4, but unfragmented MP4 content.
If it is supported, what encryption algorithm should be used? Please let me know.
Topic:
Media Technologies
SubTopic:
Streaming
Hi,
I am looking for a good way to play sounds at a high rate.
At the moment I am using AVAudioEngine: I create a couple of AVAudioPlayerNodes, and for each sound I need to play I create an AVAudioPCMBuffer.
When the app needs to play a sound, I get the correct AVAudioPCMBuffer for the sound, take the first available AVAudioPlayerNode, and feed the buffer to it.
The timing for a metronome app has to be very precise, because if it's off by even about 16 ms the user can hear that it is not playing at the right interval. At low speeds this works without any problems, but at high speeds it gets worse.
Maybe anyone has an idea on how I can improve my method.
It's a plugin for Flutter.
import AVFoundation

class FastSoundPlayer {
    private var audioPlayers: [SoundPlayer] = []
    private var sounds: [String: Sound] = [:]
    private var engine = AVAudioEngine()
    // objc_sync_enter on a Swift Array bridges to a new object each call,
    // so it doesn't actually lock; use a dedicated lock instead.
    private let lock = NSLock()
    let session = AVAudioSession.sharedInstance()

    init() {
        do {
            try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
            try session.setActive(true)
            createSoundPlayers(count: 20)
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    // Selector method to handle applicationDidBecomeActiveNotification
    func applicationDidBecomeActive() {
        // Reinitialize AVAudioEngine and reattach all nodes
        do {
            engine.reset()
            lock.lock()
            audioPlayers.removeAll()
            createSoundPlayers(count: 20)
            lock.unlock()
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    func createSoundPlayers(count: Int) {
        for _ in 0..<count {
            let player = SoundPlayer()
            engine.attach(player.player)
            engine.connect(player.player, to: engine.mainMixerNode, format: nil)
            audioPlayers.append(player)
        }
    }

    func load(sound: Data, name: String) {
        sounds[name] = Sound(soundData: sound)
    }

    func play(name: String) {
        if !engine.isRunning {
            applicationDidBecomeActive()
        }
        guard let sound = sounds[name] else {
            print("Sound not found")
            return
        }
        getAvailablePlayer()?.play(sound: sound)
    }

    func getAvailablePlayer() -> SoundPlayer? {
        return audioPlayers.first { !$0.isPlaying }
    }
}

class SoundPlayer {
    let player = AVAudioPlayerNode()
    var isPlaying = false

    init() {
        player.volume = 1.0
    }

    func play(sound: Sound) {
        guard let buffer = sound.sound else { return }
        // [weak self] avoids retaining the player through the callback.
        player.scheduleBuffer(buffer, at: nil, options: .interrupts,
                              completionCallbackType: .dataPlayedBack) { [weak self] _ in
            self?.complete()
        }
        if let engine = player.engine, engine.isRunning {
            player.play()
            isPlaying = true
        }
    }

    func complete() {
        isPlaying = false
    }
}

class Sound {
    var sound: AVAudioPCMBuffer?

    init(soundData: Data) {
        do {
            // A unique name avoids clobbering when several sounds are loaded.
            let temporaryURL = FileManager.default.temporaryDirectory
                .appendingPathComponent(UUID().uuidString + ".wav")
            try soundData.write(to: temporaryURL)
            // Create AVAudioFile from the temporary file URL
            let audioFile = try AVAudioFile(forReading: temporaryURL)
            // Use the file's own processing format; a hardcoded Int16 format
            // would make the read throw if it doesn't match the file.
            let format = audioFile.processingFormat
            guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                                   frameCapacity: AVAudioFrameCount(audioFile.length)) else {
                // Failed to create PCM buffer
                self.sound = nil
                return
            }
            // Read audio file into PCM buffer
            try audioFile.read(into: pcmBuffer)
            self.sound = pcmBuffer
        } catch {
            print("Error loading sound file: \(error.localizedDescription)")
            self.sound = nil
        }
    }
}
Thanks!
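For anyone comparing approaches: scheduling each buffer immediately (at: nil) leaves the start time to the engine, so jitter accumulates at high tempos. One common technique is to compute each click's start time in sample time and schedule it in advance. A minimal sketch, assuming player is already attached, connected, and playing:
// Sketch: sample-accurate scheduling of metronome clicks ahead of time.
func scheduleClicks(buffer: AVAudioPCMBuffer, player: AVAudioPlayerNode, bpm: Double, count: Int) {
    let sampleRate = buffer.format.sampleRate
    let framesPerBeat = AVAudioFramePosition(60.0 / bpm * sampleRate)
    // Anchor on the player's current position in sample time.
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else { return }
    for beat in 0..<count {
        let when = AVAudioTime(sampleTime: playerTime.sampleTime
                                   + AVAudioFramePosition(beat + 1) * framesPerBeat,
                               atRate: sampleRate)
        // No .interrupts here: each buffer has its own exact start time.
        player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
    }
}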
Hello,
I am trying to follow the getting-started guide. I have produced a developer token via the MusicKit embedding approach and can confirm I'm successfully authorized.
When I try to play music, I can't hear anything. I thought it could be an autoplay restriction in the browser, but that doesn't appear to be the issue, as triggering play from a button doesn't help either.
const music = MusicKit.getInstance()
try {
  await music.authorize() // successful
  const result = await music.api.music(`/v1/catalog/gb/search`, {
    term: 'Sound Travels',
    types: 'albums',
  })
  // Note: nothing from `result` is ever queued here; play() presumably
  // needs a queue to be set first (e.g. via music.setQueue(...)).
  await music.play()
} catch (error) {
  console.error('play error', error) // ! No error triggered
}
I have searched the forum, have found similar queries but apparently none using V3 of the API.
Other potentially helpful information:
OS: macOS 15.1 (24B83)
API version: V3
On localhost
Browser: Arc (Chromium-based); also tried Safari
The only difference between the two browsers is that Safari appears to exit at the breakpoint, whereas Arc continues (without throwing any errors)
authorizationStatus: 3
Side note, any reason this is still in beta so many years later?
If I want to edit an image in the Preview app, there are only options to rotate left and right in 90-degree steps; there is no option to rotate by an arbitrary angle. Please look into this and provide that option in a future update.
Topic:
Media Technologies
SubTopic:
Photos & Camera
Tags:
Image I/O
Graphics and Games
App Review
Media
Hello everyone,
I’m working on an iOS app that fetches videos from the "Recently Deleted" album using the Photos framework in Swift. However, I’m unable to fetch any videos, even though the "Recently Deleted" album contains 233 items (including videos), as seen in the Photos app.
Environment:
iOS Version: 18.3.1
Xcode Version: 16.2
Swift Version: Swift 5
Device: iPhone (simulator and physical device both tested)
Photo Library Permission: "All Photos" access granted
Recently Deleted Lock: Face ID/Passcode is disabled for "Recently Deleted"
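I'm not sure PhotoKit exposes "Recently Deleted" publicly at all, but one way to check what the framework can actually see is to enumerate the smart albums it reports. A small diagnostic sketch:
import Photos

// Sketch: list every smart album PhotoKit exposes, with its title and item count.
func listSmartAlbums() {
    let collections = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                              subtype: .any,
                                                              options: nil)
    collections.enumerateObjects { collection, _, _ in
        let assets = PHAsset.fetchAssets(in: collection, options: nil)
        print("\(collection.localizedTitle ?? "untitled"): \(assets.count) assets")
    }
}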
I'm using an iPhone 15 Pro, which has switched from Lightning to USB Type-C. My iOS version is 18.3. According to Apple's documentation, AVCaptureDevice.DeviceType should support external device types.
🔗 Apple's Official Documentation:
https://developer.apple.com/documentation/avfoundation/avcapturedevice/devicetype-swift.struct/external
The documentation clearly states that iPadOS 17.0+ and iOS 17.0+ support external devices. However, in my actual tests:
On iPhone, discoverySession does not detect any external devices.
On iPad, discoverySession can detect external devices without any issues.
My Question:
Does iPhone USB-C actually support external devices (e.g., UVC cameras)?
If not, why does Apple's documentation claim that iOS 17 supports external devices instead of specifying iPadOS 17 only?
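For reference, this is roughly the discovery code in question; a minimal sketch using the .external device type (iOS/iPadOS 17+):
import AVFoundation

// Sketch: look for external (e.g. UVC) cameras via the .external device type.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                 mediaType: .video,
                                                 position: .unspecified)
if discovery.devices.isEmpty {
    print("No external capture devices found")
} else {
    discovery.devices.forEach { print("Found: \($0.localizedName)") }
}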
I'm a new app developer and am trying to add a button that adds pictures from both the photo library AND the camera. I added the first function (adding pictures from the photo library) using the new-ish PhotosPicker, but I can't find a way to do the same thing for the camera. Should I just tough it out and use the UIViewControllerRepresentable approach I've seen in all of the YouTube tutorials I've come across?
I also want the user to be able to crop the picture in the app after they take it.
Thanks in advance
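In case it saves someone the search: there is still no pure-SwiftUI camera picker, so the usual answer is indeed a small UIViewControllerRepresentable around UIImagePickerController. A minimal sketch (allowsEditing gives the system's built-in square crop UI):
import SwiftUI
import UIKit

// Sketch: SwiftUI wrapper for the system camera with basic crop support.
struct CameraPicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?
    @Environment(\.dismiss) private var dismiss

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.allowsEditing = true // lets the user crop after capture
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            parent.image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
            parent.dismiss()
        }

        func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
            parent.dismiss()
        }
    }
}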
I have an SCStreamDelegate for capturing frames from applications. On recent point releases of macOS Sonoma, I've noticed that the stream is being cancelled with no user action being taken. I started trying to debug it, and when my on-error method is called, the error parameter being passed is null:
func stream(_ stream: SCStream, didStopWithError error: Error) {
/*debugger shows this and segfaults if I try to print "\(error)"
error (Error)
> error = (Builtin.RawPointer) 0x0
*/
From what I can tell, error should be a valid NSError so I can check the error code, based on similar code I've seen in, for example OBS (https://github.com/obsproject/obs-studio/blob/265239d4174f8d291b0de437088c5b78f8e27687/plugins/mac-capture/mac-sck-common.m#L29)
Usually when this happens, the menu bar icon for screen sharing (where I would click to change the shared window, etc.) stays there even after my app has closed and no apps are doing any sharing.
Has anyone come across this before? Am I misinterpreting what the debugger is saying about the error parameter?
I'm running macOS 14.7.3, but I just updated from 14.7.2 earlier and had basically the same issue on both versions.
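For what it's worth, a defensive way to inspect the error is to bridge it to NSError before formatting anything, so you can read the domain and code without string-interpolating a possibly invalid value. A sketch; if the underlying pointer really is null, even this may crash, which would itself be worth a feedback report:
func stream(_ stream: SCStream, didStopWithError error: Error) {
    // Bridge to NSError to read domain/code rather than printing the raw value.
    let nsError = error as NSError
    print("SCStream stopped. domain=\(nsError.domain) code=\(nsError.code)")
}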
I'm developing a tutorial style tvOS app with multiple videos. The examples I've seen so far deal with only one video.
Defining the player and source (URL) before the body view:
let avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../video.mp4")!)
and then in the body view the video is displayed
VideoPlayer(player: avPlayer)
This allows options such as stop/start etc.
When I try something similar with a video title passed into this view, I can't define the player with this title variable:
var vTitle: String
var avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../" + vTitle + ".mp4")!)
var body: some View {
I get an error that vTitle can't be used in the URL above the body view (a property initializer can't reference another instance property).
Any thoughts or suggestions? Thanks
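A sketch of one common workaround, keeping the same placeholder URL: build the player after the view's properties exist, for example in onAppear (or a custom init):
import SwiftUI
import AVKit

struct TutorialVideoView: View {
    let vTitle: String
    @State private var avPlayer: AVPlayer?

    var body: some View {
        VideoPlayer(player: avPlayer)
            .onAppear {
                // Property initializers can't see `vTitle`; onAppear can.
                if avPlayer == nil,
                   let url = URL(string: "https://domain.com/.../.../\(vTitle).mp4") {
                    avPlayer = AVPlayer(url: url)
                }
            }
    }
}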
I have a complex CoreImage pipeline which I'm keen to optimise. I'm aware that calling back to the CPU can have a significant impact on the performance - what is the best way to find out where this is happening?
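Two things that might help, offered tentatively: Apple's CI_PRINT_TREE environment variable (set it in the scheme's environment) dumps the filter graph Core Image actually builds, including where it falls back to the CPU; and wrapping renders in signposts lets Instruments attribute time to each stage. A minimal signpost sketch, where the subsystem string is a placeholder:
import CoreImage
import os.signpost

let ciLog = OSLog(subsystem: "com.example.app", category: .pointsOfInterest)

// Sketch: bracket each render so Instruments can show where the time goes.
func render(_ image: CIImage, with context: CIContext, to destination: CVPixelBuffer) {
    os_signpost(.begin, log: ciLog, name: "CIRender")
    context.render(image, to: destination)
    os_signpost(.end, log: ciLog, name: "CIRender")
}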
The problem I have at the moment is that if a phone call comes in during my recording, my recording is interrupted even if I don't answer.
The symptom of the interruption is that the picture freezes; recording resumes automatically after the call ends, but the recorded video's audio and video end up out of sync.
By listening for AVCaptureSessionWasInterrupted notifications, I can get the interruption type and react to the interruption.
As far as I can tell, a ringing or vibrating phone can take over the audio channel. I've seen other apps handle the same scenario by suppressing the ringtone or vibration, but I don't know how to do that; I've tried a lot of approaches and none of them work.
In BlackmagicCam or the ProMovie app, when a call comes in during recording there is only a notification banner, with no ringtone or vibration, which avoids the recording interruption.
I don't know if this requires some configuration or entitlement; please let me know if it does.
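For anyone instrumenting the same problem, here is a hedged sketch of reading the interruption reason from the notification, which at least tells you whether it's the audio device being grabbed:
import AVFoundation

// Sketch: observe capture-session interruptions and log the reason.
NotificationCenter.default.addObserver(
    forName: AVCaptureSession.wasInterruptedNotification,
    object: nil, queue: .main
) { note in
    if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
        // e.g. .audioDeviceInUseByAnotherClient when a call takes the mic
        print("Capture interrupted: \(reason)")
    }
}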
Hi, I just released an app which is live. I have a strange issue: the audio files in the app play fine on my device, but some users are unable to hear them. One friend said a file played yesterday but not today. Any idea why? The files are MP3s; I see them in the Build Phases and in the project, obviously. Here's the audio view code, thank you!
import SwiftUI
import AVFoundation

struct MeditationView: View {
    @State private var player: AVAudioPlayer?
    @State private var isPlaying = false
    @State private var selectedMeditation: String?
    var isiPad = UIDevice.current.userInterfaceIdiom == .pad
    let columns = [GridItem(.flexible()), GridItem(.flexible())]
    let tracks = ["Intro": "intro.mp3",
                  "Peace": "mysoundbath1.mp3",
                  "Serenity": "mysoundbath2.mp3",
                  "Relax": "mysoundbath3.mp3"]

    var body: some View {
        VStack {
            VStack {
                VStack {
                    Image("dhvani").resizable().aspectRatio(contentMode: .fit)
                        .frame(width: 120)
                    Text("Enter the world of Dhvani soundbath sessions, click lotus icon to play.")
                        .font(.custom("Times New Roman", size: 20))
                        .lineLimit(nil)
                        .multilineTextAlignment(.leading)
                        .fixedSize(horizontal: false, vertical: true)
                        .italic()
                        .foregroundStyle(Color.ashramGreen)
                        .padding()
                }
                LazyVGrid(columns: columns, spacing: 10) {
                    ForEach(tracks.keys.sorted(), id: \.self) { track in
                        Button {
                            self.playMeditation(named: tracks[track]!)
                        } label: {
                            Image("lotus")
                                .resizable()
                                .frame(width: 40, height: 40)
                                .background(Color.ashramGreen)
                                .cornerRadius(10)
                        }
                        Text(track)
                            .font(.custom("Times New Roman", size: 22))
                            .foregroundStyle(Color.ashramGreen)
                            .italic()
                    }
                }
                HStack(spacing: 20) {
                    Button(action: { self.togglePlayPause() }) {
                        Image(systemName: isPlaying ? "playpause.fill" : "play.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                            .foregroundColor(Color.ashramGreen)
                    }
                    Button(action: { self.stopMeditation() }) {
                        Image(systemName: "stop.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                            .foregroundColor(Color.ashramGreen)
                    }
                }
            }
            .padding()
            .background(Color.ashramBeige)
            .cornerRadius(20)
            Spacer()
            // Video play
            VStack {
                Text("Chant")
                    .font(.custom("Times New Roman", size: 24))
                    .foregroundStyle(Color.ashramGreen)
                    .padding(5)
                WebView(urlString: "https://www.youtube.com/embed/ny3TqP9BxzE")
                    .frame(height: isiPad ? 400 : 200)
                    .cornerRadius(10)
                    .padding()
                Text("Courtesy Sri Ramanasramam").font(.footnote).italic()
            }
        }
        .background(Color.ashramBeige)
    }

    func playMeditation(named name: String) {
        if let url = Bundle.main.url(forResource: name, withExtension: nil) {
            do {
                player = try AVAudioPlayer(contentsOf: url)
                player?.play()
                isPlaying = true
            } catch {
                print("Error playing meditation: \(error.localizedDescription)")
            }
        }
    }

    func togglePlayPause() {
        if let player = player {
            if player.isPlaying {
                player.pause()
                isPlaying = false
            } else {
                player.play()
                isPlaying = true
            }
        }
    }

    func stopMeditation() {
        player?.stop()
        isPlaying = false
    }
}

#Preview {
    MeditationView()
}
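One hedged observation for the "works for me, silent for some users" symptom: this view never configures an AVAudioSession, and the default ambient-style behavior is muted by the ringer/silent switch, which would explain "played yesterday but not today". A minimal sketch of opting into playback, called e.g. before play():
import AVFoundation

// Sketch: .playback keeps audio audible even with the silent switch on.
func activatePlaybackSession() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error.localizedDescription)")
    }
}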
Topic:
Media Technologies
SubTopic:
Audio
Hello all! I've been having this issue for a while on my iPhone 12 Pro.
The volume when listening to music or watching YouTube, TikTok, etc. will randomly lower. The actual volume slider won't move; it will still be at max volume, but the audio gets very quiet. The same happens on phone calls. I've followed other instructions, such as turning off audio awareness and other settings, but nothing seems to work. Has anyone else had this issue and managed to fix it?
Topic:
Media Technologies
SubTopic:
Audio
Hello,
I am getting the following error while attempting to run my LockedCameraCapture-compatible app on an iOS 15 device:
dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)
Of course iOS 15 doesn't have the library for LockedCameraCapture, but I have had no issue including Lock Screen Widgets (which require iOS 16), so I am not sure why the error is popping up.
Thank you!
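A tentative note for anyone hitting the same dyld crash: frameworks that only exist on newer OS versions generally need to be weak-linked so the binary can launch without them, with all uses guarded by #available. If Xcode isn't doing this automatically for LockedCameraCapture, the classic workaround is the linker flag below (an assumption to verify against your project, not a confirmed fix):
// In Build Settings -> Other Linker Flags (or an .xcconfig):
OTHER_LDFLAGS = $(inherited) -weak_framework LockedCameraCapture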
Topic:
Media Technologies
SubTopic:
Photos & Camera