I'm trying to record video, with audio input coming from a connected Bluetooth headset. However, I haven't been able to get an AVCaptureDevice that represents the headset, so I can't add it to my AVCaptureSession.

I tried to find it using a discovery session, but it never finds the headset, only the built-in microphone. That may not be surprising since I'm asking for .builtInMicrophone, but the enumeration has no members for external audio inputs. I tried leaving the device-type array empty, but that finds no devices.

let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInMicrophone], mediaType: AVMediaType.audio, position: .unspecified)
print("Found \(discoverySession.devices.count) devices")
for device in discoverySession.devices {
print("Device: \(device)")
}

After doing some searching, I tried setting up the AVAudioSession to specifically allow Bluetooth. However, this has had no effect on the above.

private let session = AVCaptureSession()
// later...
session.usesApplicationAudioSession = true
session.automaticallyConfiguresApplicationAudioSession = false
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.allowBluetooth])
try AVAudioSession.sharedInstance().setActive(true)
} catch {
print("Error messing with audio session: \(error)")
}

For completeness, I also tried the deprecated AVCaptureDevice.devices() method, but it doesn't find the Bluetooth headset either.

I know that the headset is available because AVAudioSession can see it. However, I haven't been able to get from the availableInputs array to something I can use in an AVCaptureSession. The following code finds the headset, but what would I do with the result? It's not an AVCaptureDevice, nor can I construct one from the entries in the array. Setting the "preferred" input doesn't have any effect that I know how to use.

if let availableInputs = AVAudioSession.sharedInstance().availableInputs {
print("Found \(availableInputs.count) inputs")
for input in availableInputs {
print("Input: \(input)")
if input.portType == AVAudioSessionPortBluetoothHFP {
print("Setting preferred input")
do {
try AVAudioSession.sharedInstance().setPreferredInput(input)
} catch {
print("Error setting preferred input: \(error)")
}
}
}
}

Given that the Bluetooth headset is connected and available, how do I set it as the audio input for my capture session?
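Putting the pieces above together, my full setup looks roughly like the sketch below — the audio session is configured for Bluetooth before the capture session is built, on the theory that the capture input follows the active audio route. This is what I've tried, not a confirmed recipe:

```swift
import AVFoundation

// Sketch: configure the shared audio session for Bluetooth first, then build
// the capture session so it uses (and doesn't reconfigure) that session.
func configureCapture() throws -> AVCaptureSession {
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.allowBluetooth])
    try audioSession.setActive(true)

    let captureSession = AVCaptureSession()
    captureSession.usesApplicationAudioSession = true
    captureSession.automaticallyConfiguresApplicationAudioSession = false

    // There's still no Bluetooth member in AVCaptureDevice.DeviceType, so the
    // only audio device I can add is the default (built-in) one.
    if let audioDevice = AVCaptureDevice.default(for: .audio) {
        let audioInput = try AVCaptureDeviceInput(device: audioDevice)
        if captureSession.canAddInput(audioInput) {
            captureSession.addInput(audioInput)
        }
    }
    return captureSession
}
```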
I'm trying to use unified logging to log messages and retrieve them from a device for inspection. I created a new OSLog instance with

let logDebug = OSLog(subsystem: Bundle.main.bundleIdentifier!, category: "debug")

Then I log messages with

os_log("%@", log: logDebug, type: OSLogType.debug, String(format: message, arguments: formatArgs))

This works fine in the Xcode console. It's supposed to be possible to retrieve these logs from a device by:

1. Doing a sysdiagnose (as Apple explains here)
2. Transferring the sysdiagnose to a Mac, and then
3. Inspecting the log (in system_logs.logarchive) with Console.app or the log command-line tool.

When I do this, none of my logs are shown. No logs with the expected subsystem (bundle ID) are shown.

However, if I examine the log files in system_logs.logarchive directly via grep or vim, the messages are present. So they're in the logs, but for some reason neither Console.app nor log will show them. I made sure that Console.app is set to show "all messages" and I haven't entered any search terms.

What step am I missing, or what detail needs to be different?
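One variable I can isolate is the log level. Here's a sketch of the same logging at .default instead of .debug, in case the archive tooling treats debug-level messages differently (an assumption, not something I've confirmed):

```swift
import os.log

// Same logger as above, but the message is written at .default level
// instead of .debug, to test whether the level affects what Console.app
// or `log show` will display from the archive.
let logGeneral = OSLog(subsystem: Bundle.main.bundleIdentifier!, category: "debug")

os_log("%@", log: logGeneral, type: .default, "test message")
```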
NSManagedObjectContext has a method called refreshAllObjects(), but it's not clear what effect it's supposed to have. The docs say nothing, and the header comments include a line saying that it "calls -refreshObject:mergeChanges: on all currently registered objects with this context".

OK. But. When calling refresh(NSManagedObject, mergeChanges: Bool), the second argument matters. Passing "true" has a different effect than passing "false". Which of these does refreshAllObjects() use?
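To make the question concrete, here's a sketch of the behavior difference I mean (entity and key names are illustrative):

```swift
import CoreData

// Illustrating why the mergeChanges flag matters. Assume `context` is an
// NSManagedObjectContext and `person` is a registered object in it.
func demonstrateRefresh(_ context: NSManagedObjectContext, _ person: NSManagedObject) {
    person.setValue("Edited", forKey: "name")   // unsaved, in-memory change

    // mergeChanges: true  -> reloads store values but preserves the unsaved
    //                        "Edited" value for `name`.
    // mergeChanges: false -> turns the object back into a fault; the
    //                        unsaved change is discarded.
    context.refresh(person, mergeChanges: true)

    // So: which of these two behaviors does refreshAllObjects() apply
    // to every registered object?
}
```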
I'm trying to get my app to show up as a messaging option in the Contacts app, as described in WWDC 2016 session 240: https://developer.apple.com/videos/play/wwdc2016/240/?time=1072

What I'm aiming for is something like WhatsApp, which was discussed in the WWDC session. The code presented doesn't duplicate WhatsApp's Contacts integration, though. It only integrates my app with contacts specifically named as recipients in my INSendMessageIntent.

The code to create and donate an interaction looks like

let activity = NSUserActivity(activityType: "com.example.message")
activity.title = "Send CB Test Message"
activity.expirationDate = Date.distantFuture
let recipient = INPerson( /* recipient with an email address in my Contacts database */ )
let sender = INPerson( /* me */ )
let intent = INSendMessageIntent(recipients: [recipient], content: nil, groupName: nil, serviceName: "CB Test Chat", sender: sender)
let response = INSendMessageIntentResponse(code: .success, userActivity: activity)
let interaction = INInteraction(intent: intent, response: response)
interaction.direction = .outgoing
interaction.donate { (error) in
print("Donated")
if let error = error {
print("Donate error: \(error)")
}
}

This succeeds and my app shows up in the recipient's contact card. Fine.

But I notice that WhatsApp options appear on all contact cards, including people who don't have WhatsApp accounts. How does that work? I tried passing an empty array or nil as the recipient list, but that had no effect. In fact, if I create a new contact with bogus information, WhatsApp still appears as a phone and video option.

How does that work? I had thought maybe WhatsApp was scanning my contacts and donating an interaction for everyone. But it's not even running, and I still see them as an option for newly created contact entries.
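For reference, my "donate one interaction per contact" hypothesis would look something like the sketch below — purely illustrative (personFrom(_:) is a helper I made up for the sketch), and as noted, it can't explain brand-new contacts, since the app isn't running when those are created:

```swift
import Intents
import Contacts

// Hypothetical helper: build an INPerson from a CNContact's first email.
func personFrom(_ contact: CNContact) -> INPerson {
    let email = contact.emailAddresses.first.map { String($0.value) } ?? ""
    let handle = INPersonHandle(value: email, type: .emailAddress)
    return INPerson(personHandle: handle,
                    nameComponents: nil,
                    displayName: CNContactFormatter.string(from: contact, style: .fullName),
                    image: nil,
                    contactIdentifier: contact.identifier,
                    customIdentifier: nil)
}

// Sketch of the "donate an interaction for every contact" theory.
func donateForAllContacts(contacts: [CNContact], me: INPerson) {
    for contact in contacts {
        let intent = INSendMessageIntent(recipients: [personFrom(contact)],
                                         content: nil,
                                         groupName: nil,
                                         serviceName: "CB Test Chat",
                                         sender: me)
        let interaction = INInteraction(intent: intent, response: nil)
        interaction.direction = .outgoing
        interaction.donate { error in
            if let error = error {
                print("Donate error: \(error)")
            }
        }
    }
}
```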