Wouldn't it be great to port RealityKit to the iPad Pro with M1 and capture objects on it?
I haven't found any information about this, but am I missing something? Is there a plan to bring it to iOS soon?
Best!
Hello,
I'm developing an app that plays a short audio clip when you press a button. When I test with VoiceOver and "press" the button, the voice says "Play button, playing..." over the audio I'm playing, and it is impossible to hear it.
What do you think is the best approach?
I would like to delay my audio, but I couldn't find a callback for when VoiceOver finishes speaking the sentence.
Is it correct to remove the accessibility label from this kind of button (play/pause/stop)?
Thanks!
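Rather than removing the label (VoiceOver users still need to know what the button does), there is an accessibility trait designed for exactly this case: `.startsMediaSession` tells VoiceOver to stop speaking once the control is activated. A minimal SwiftUI sketch, where `playAudio` stands in for your existing play handler:

```swift
import SwiftUI

// Sketch: .startsMediaSession silences VoiceOver's announcement when the
// control is activated, so the label can stay for discoverability.
struct PlayButton: View {
    let playAudio: () -> Void  // assumption: your existing play handler

    var body: some View {
        Button("Play", action: playAudio)
            .accessibilityAddTraits(.startsMediaSession)
    }
}
```

The label is still announced while browsing, but activating the button no longer talks over the audio.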
Hello,
In the last release, the "Principal Class" setting stopped working. I'm using my own UIApplication subclass to implement a "kiosk mode" in my app (after five minutes with no events, it goes back to the root view).
In the new release, the SwiftUI life cycle is not using my UIApplication subclass.
Is there a trick to make it work?
Thanks!
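For reference, here is a sketch of the kiosk-timer technique the post describes, as it works under the UIKit life cycle (the class name `KioskApplication` and the notification name are assumptions, not API):

```swift
import UIKit

// Classic kiosk pattern: a UIApplication subclass restarts an inactivity
// timer on every event and notifies observers after five minutes of silence.
class KioskApplication: UIApplication {
    static let timeoutNotification = Notification.Name("KioskTimeout")
    private var inactivityTimer: Timer?

    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)
        inactivityTimer?.invalidate()
        inactivityTimer = Timer.scheduledTimer(withTimeInterval: 5 * 60, repeats: false) { _ in
            // No events for five minutes: ask the UI to pop to the root view.
            NotificationCenter.default.post(name: Self.timeoutNotification, object: nil)
        }
    }
}
```

Under UIKit this class is what the "Principal Class" key (or `UIApplicationMain`) names; how to inject it into the SwiftUI App life cycle is exactly the open question here.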
Hello,
The NavigationView just changed and I'm a bit lost. Could you tell me the correct way to show the title with a background (material) if the content is a map?
Thanks!!!
import SwiftUI
import MapKit

struct ExampleStationListView_Previews: PreviewProvider {
    static var previews: some View {
        NavigationView {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationBarTitle("Select Station", displayMode: .inline)
        }
    }
}
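If you have moved to the iOS 16 navigation APIs, here is a hedged sketch of one way to keep an inline title over a material bar background (assuming iOS 16+ and that `NavigationStack` replaces your `NavigationView`):

```swift
import SwiftUI
import MapKit

// Assumption: iOS 16+. Force the bar background to stay visible over the
// map and render it with a material so the inline title remains readable.
struct StationMapView: View {
    var body: some View {
        NavigationStack {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationTitle("Select Station")
                .navigationBarTitleDisplayMode(.inline)
                .toolbarBackground(.visible, for: .navigationBar)
                .toolbarBackground(.ultraThinMaterial, for: .navigationBar)
        }
    }
}
```

Without `.toolbarBackground(.visible, for: .navigationBar)`, the bar stays transparent over full-bleed content like a map.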
Hello!
I have this code in my main app:
.onAppear {
    var r: CGFloat = 0.0
    var g: CGFloat = 0.0
    var b: CGFloat = 0.0
    var a: CGFloat = 0.0
    UIColor(Color.accentColor).getRed(&r, green: &g, blue: &b, alpha: &a)
    print("\(r), \(g), \(b), \(a)")
}
If I'm using "Color.red", the output is:
0.9999999403953552, 0.23137253522872925, 0.18823528289794922, 1.0
But, if I'm using Color.accentColor:
0.0, 0.4784314036369324, 0.9999999403953552, 1.0
Instead of:
0.3059999942779541, 0.16499999165534973, 0.5180000066757202, 1.0
which is the one set in Assets.xcassets.
It's not only Color.accentColor but all the colors defined in the xcassets: for example, a color with Any/Dark appearances will always return the Any components.
Details:
https://github.com/heltena/SwiftUI-GetComponents
Thanks!
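A hedged reading of what's happening: colors from an asset catalog are dynamic, and `UIColor(Color.accentColor)` resolved outside a view's trait environment falls back to the default (Any/light) values. One way to read the components for a specific appearance is to resolve the `UIColor` against explicit traits (the asset name "AccentColor" is an assumption):

```swift
import UIKit

// Resolve a catalog color against explicit traits before reading components;
// an unresolved dynamic color reports its Any/light values.
let base = UIColor(named: "AccentColor")!  // assumption: your asset name
let dark = base.resolvedColor(with: UITraitCollection(userInterfaceStyle: .dark))

var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
dark.getRed(&r, green: &g, blue: &b, alpha: &a)
print("\(r), \(g), \(b), \(a)")
```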
Hello all!
I'm working on Earthtunes, an app to listen to the Solid Earth. Ad: it's available on the App Store if you want to take a look (I'll be happy to hear from you!)
The point is that this app generates a video to send a sound (not only earthquakes, but any cut of a seismometer's sound) by text, WhatsApp... and sometimes it raises an error saying that it could not send the video. I couldn't find the reason for it. I'm posting a simplified project [1] that reproduces the error, and I'd like to ask if you know what's going on there.
It seems related to the video, but I'm always using the same codec, and sometimes it works and sometimes it doesn't.
Thanks a lot!!!
[1] https://github.com/heltena/ExportTunes
Hi all,
I'm looking for an algorithm to generate random numbers in a kernel shader function, similar to "curand" for CUDA, but I couldn't find one. Is there an interesting library?
Thank you so much!
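There is no curand equivalent in the Metal standard library; the usual approach is a small stateless hash seeded with the thread position in the grid. Below is a Swift port of the widely used PCG-style hash from Jarzynski & Olano's "Hash Functions for GPU Rendering", so you can verify it on the CPU; the same two functions translate line-for-line into a Metal kernel using `uint` and `float`.

```swift
// PCG-style integer hash; in a Metal kernel you would seed it with
// thread_position_in_grid (plus a per-frame offset for new streams).
func pcgHash(_ input: UInt32) -> UInt32 {
    let state = input &* 747796405 &+ 2891336453
    let word = ((state >> ((state >> 28) &+ 4)) ^ state) &* 277803737
    return (word >> 22) ^ word
}

// Map the 32-bit hash to a uniform value in [0, 1].
func random01(seed: UInt32) -> Float {
    Float(pcgHash(seed)) / Float(UInt32.max)
}
```

Being stateless, it needs no buffer of generator state: each thread derives its own stream from its seed, which is usually what you want in a kernel.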
Hello,
I'm trying to play some waveforms I'm downloading from a seismometer and the sound is not good.
I decided to create a simple wave (C5 note, 523.25 Hz) and play it, and it doesn't sound right either.
Here is my code:
import AVFoundation
import Combine
class ContentViewModel: ObservableObject {
		let audioEngine: AVAudioEngine
		let player: AVAudioPlayerNode
		
		let data: [Double]
		let sampleRate: Double
		init() {
				let sinFrequency: Double = 523.25	/* C5 */
				let sampleRate: Double = 44100
				let seconds: Double = 5
				
				let range = 0 ..< Int(seconds * sampleRate)
				self.data = range.map { sin(2.0 * .pi * Double($0) * sinFrequency / sampleRate) }
				self.sampleRate = sampleRate
				
				audioEngine = AVAudioEngine()
				let _ = audioEngine.mainMixerNode
				audioEngine.prepare()
				try! audioEngine.start()
				try! AVAudioSession.sharedInstance().setCategory(.playback)
				
				self.player = AVAudioPlayerNode()
				audioEngine.attach(player)
		}
		func copyBuffer<T: FixedWidthInteger>(data: [Double], buffer: AVAudioPCMBuffer, channelData: UnsafePointer<UnsafeMutablePointer<T>>) {
				buffer.frameLength = buffer.frameCapacity
				let buffData = data.map { T(Double(T.max) * $0) }
				memcpy(channelData[0], buffData, Int(buffer.frameCapacity) * MemoryLayout<T>.size)
		}
		enum BufferType {
				case int16
				case int32
		}
		
		func createBuffer(for type: BufferType) -> AVAudioPCMBuffer {
				switch type {
				case .int16:
						guard
								let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
								let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
								let channelData = buffer.int16ChannelData
						else {
								fatalError()
						}
						copyBuffer(data: data, buffer: buffer, channelData: channelData)
						return buffer
				case .int32:
						guard
								let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt32, sampleRate: sampleRate, channels: 1, interleaved: false),
								let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
								let channelData = buffer.int32ChannelData
						else {
								fatalError()
						}
						copyBuffer(data: data, buffer: buffer, channelData: channelData)
						return buffer
				}
		}
		
		func play(for type: BufferType) {
				let buffer = createBuffer(for: type)
				
				let linkFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
				audioEngine.connect(player, to: audioEngine.mainMixerNode, format: linkFormat)
				audioEngine.prepare()
				audioEngine.mainMixerNode.outputVolume = 0.5
				player.scheduleBuffer(buffer, at: nil, options: .interrupts, completionHandler: nil)
				if !player.isPlaying {
						player.play()
				}
		}
}
You can hear the note by searching YouTube for "Middle C Sine Wave for Ten Hours - 261.6 hertz" (the title is wrong; that video is for C5).
Could you please tell me why my sound does not sound like the real C5 note?
Thanks!!!
You can create a simple ContentView in Swift with this code:
import SwiftUI
struct ContentView: View {
		@StateObject var viewModel = ContentViewModel()
		
		var body: some View {
				VStack {
						Spacer()
						HStack {
								Button("Play Int16") {
										viewModel.play(for: .int16)
								}
								Button("Play Int32") {
										viewModel.play(for: .int32)
								}
						}
						Spacer()
				}
		}
}
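Coming back to the distorted C5 above, a hedged guess: `play(for:)` connects the player to the mixer with a Float32 standard format while the scheduled buffer holds integer PCM, so the engine reinterprets the samples. Connecting with the buffer's own format (`audioEngine.connect(player, to: audioEngine.mainMixerNode, format: buffer.format)`), or converting through `AVAudioConverter`, is worth trying. The generation and scaling themselves can be sanity-checked in pure Swift, without AVFoundation:

```swift
import Foundation

// CPU-only check that the sine generation plus the Double -> Int16 scaling
// used in copyBuffer preserve the waveform; if this is clean, any distortion
// is introduced later in the engine graph.
func scaledSineSamples(frequency: Double, sampleRate: Double, count: Int) -> [Int16] {
    (0..<count).map { i in
        let sample = sin(2.0 * .pi * Double(i) * frequency / sampleRate)
        return Int16(Double(Int16.max) * sample)  // |sample| <= 1, no overflow
    }
}
```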