Hi LGariv
Sorry for the delay in getting back to you, but you do indeed have a conversion error in your project. I updated the convertBuffer function from your project and I am now getting matches.
Firstly, your assumption about the bytes per frame was incorrect. I fixed this by reading the value from the ASBD:
let frameLength = AVAudioFrameCount(mBuffers.mDataByteSize / **asbd.mBytesPerFrame**)
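For example (illustrative numbers only): with interleaved 16-bit stereo PCM the ASBD reports mBytesPerFrame = 4 (2 bytes per sample × 2 channels), so a 4,096-byte buffer holds 1,024 frames; a hard-coded value that doesn't match the actual format throws the frame count off.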
Then you need to copy the memory into your output buffer; the full method is below. Please use this only as a starting point: it does not handle multiple channels correctly and should only be used as a guide.
```swift
import AVFoundation
import CoreMedia

func convertBuffer(sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer {

    // Read the stream description straight from the sample buffer so the
    // bytes-per-frame value always matches the incoming audio.
    var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)!.audioStreamBasicDescription!

    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    // Fill audioBufferList with pointers into the sample buffer's data.
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: 0,
        blockBufferOut: &blockBuffer
    )

    let mBuffers = audioBufferList.mBuffers

    // The frame count comes from the ASBD, not a hard-coded bytes-per-frame value.
    let frameLength = AVAudioFrameCount(mBuffers.mDataByteSize / asbd.mBytesPerFrame)

    let pcmBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(streamDescription: &asbd)!,
                                     frameCapacity: frameLength)!
    pcmBuffer.frameLength = frameLength

    // Copy the raw audio bytes into the PCM buffer (first buffer/channel only).
    memcpy(pcmBuffer.mutableAudioBufferList.pointee.mBuffers.mData,
           audioBufferList.mBuffers.mData,
           Int(audioBufferList.mBuffers.mDataByteSize))
    pcmBuffer.mutableAudioBufferList.pointee.mBuffers.mDataByteSize = audioBufferList.mBuffers.mDataByteSize

    return pcmBuffer
}
```
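For reference, here is how you might call it. This is only a minimal sketch and assumes your sample buffers arrive via an AVCaptureAudioDataOutput delegate; adapt it to however you actually receive CMSampleBuffers, and replace the final comment with whatever consumes the converted buffer in your app.

```swift
import AVFoundation

// Minimal usage sketch (assumption: sample buffers come from an
// AVCaptureAudioDataOutput delegate; adapt to your actual capture path).
final class AudioCaptureHandler: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pcmBuffer = convertBuffer(sampleBuffer: sampleBuffer)
        // Hand pcmBuffer off to whatever consumes it (matcher, recorder, etc.).
        _ = pcmBuffer
    }
}
```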
Lastly, cool app, keep up the good work!