Posts

Post marked as solved
1 Reply
166 Views
Wouldn't it be great to bring RealityKit's object capture to the iPad Pro with M1 and capture objects directly on it? I can't find any information about this, but am I missing something? Is there a plan to bring it to iOS soon? Best!
Posted by heltena. Last updated.
Post marked as solved
2 Replies
219 Views
Hello, I'm developing an app that plays a short audio clip when you push a button. When I test with VoiceOver and activate the button, the voice says "Play button, playing..." over the audio I'm playing, and it is impossible to hear it. What do you think is the best approach? I would like to delay my audio, but I couldn't find a callback for when VoiceOver finishes speaking. Is it correct to remove the accessibility label for these kinds of buttons (play/pause/stop)? Thanks!
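One possible approach (a sketch, not a confirmed answer from the thread): UIKit does post `UIAccessibility.announcementDidFinishNotification`, but only for announcements you post yourself, not for the button's own label. So you can speak the status explicitly and start playback when that notification fires. `AnnouncedPlayer` is a hypothetical helper name, not an Apple API:

```swift
import UIKit
import AVFoundation

// Hypothetical helper: post our own announcement, then start playback
// only once VoiceOver reports the announcement has finished speaking.
final class AnnouncedPlayer {
    private var token: NSObjectProtocol?
    let player: AVAudioPlayer

    init(player: AVAudioPlayer) {
        self.player = player
    }

    func announceThenPlay(_ text: String) {
        // Without VoiceOver there is nothing to wait for.
        guard UIAccessibility.isVoiceOverRunning else {
            player.play()
            return
        }
        token = NotificationCenter.default.addObserver(
            forName: UIAccessibility.announcementDidFinishNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            guard let self = self, let token = self.token else { return }
            NotificationCenter.default.removeObserver(token)
            self.token = nil
            self.player.play()
        }
        // This notification only fires for announcements we post ourselves,
        // so we speak the status explicitly instead of relying on the label.
        UIAccessibility.post(notification: .announcement, argument: text)
    }
}
```

Removing the accessibility label entirely would hurt discoverability; deferring playback keeps the button fully labeled.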
Posted by heltena. Last updated.
Post not yet marked as solved
0 Replies
179 Views
Hello, in the latest release, the "Principal Class" is no longer working. I'm using my own UIApplication subclass to implement a "kiosk mode" in my app (after five minutes with no events, it returns to the root view). In the new release, the SwiftUI life cycle does not use my UIApplication class. Is there a trick to make it work? Thanks!
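One known workaround (a sketch, assuming you can give up the `@main` attribute): the SwiftUI `App` life cycle never consults the principal class, so you can fall back to a hand-written `main.swift` that passes your `UIApplication` subclass to `UIApplicationMain` and hosts the SwiftUI hierarchy from a delegate. `KioskApplication`, `AppDelegate`, and `ContentView` are placeholder names for your own types:

```swift
// main.swift — remove @main from your App struct first.
import SwiftUI
import UIKit

// Your UIApplication subclass; sendEvent is where the five-minute
// inactivity timer would be reset (sketch only).
class KioskApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)
        // restart the inactivity timer here
    }
}

// Minimal delegate that installs the SwiftUI root view in a window.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        let window = UIWindow(frame: UIScreen.main.bounds)
        window.rootViewController = UIHostingController(rootView: ContentView())
        window.makeKeyAndVisible()
        self.window = window
        return true
    }
}

// Hand-rolled entry point: the third argument names the principal class.
UIApplicationMain(
    CommandLine.argc,
    CommandLine.unsafeArgv,
    NSStringFromClass(KioskApplication.self),
    NSStringFromClass(AppDelegate.self)
)
```

The trade-off is losing the `App`/`Scene` entry point, but the event-interception point (`sendEvent`) is preserved.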
Posted by heltena. Last updated.
Post not yet marked as solved
0 Replies
243 Views
Hello, the NavigationView just changed and I'm a bit lost. Could you tell me the correct way to show the title with a (material) background when the content is a map? Thanks!!!

import SwiftUI
import MapKit

struct ExampleStationListView_Previews: PreviewProvider {
    static var previews: some View {
        NavigationView {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationBarTitle("Select Station", displayMode: .inline)
        }
    }
}
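A sketch of one workaround, assuming the iOS 15 behavior is the culprit: the new bars use a transparent scroll-edge appearance by default, and a `Map` never "scrolls" content under the bar, so the material never appears. Forcing a default (blurred) background on the scroll-edge appearance via UIKit restores it:

```swift
import SwiftUI
import MapKit

struct StationMapView: View {
    init() {
        // Force the blurred material background even at the scroll edge,
        // where iOS 15 navigation bars are transparent by default.
        let appearance = UINavigationBarAppearance()
        appearance.configureWithDefaultBackground()
        UINavigationBar.appearance().scrollEdgeAppearance = appearance
    }

    var body: some View {
        NavigationView {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationBarTitle("Select Station", displayMode: .inline)
        }
    }
}
```

Using the `UINavigationBar.appearance()` proxy affects every bar in the app; scope it differently if that is too broad.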
Posted by heltena. Last updated.
Post not yet marked as solved
0 Replies
403 Views
Hello! I have this code in my main app:

.onAppear {
    var r: CGFloat = 0.0
    var g: CGFloat = 0.0
    var b: CGFloat = 0.0
    var a: CGFloat = 0.0
    UIColor(Color.accentColor).getRed(&r, green: &g, blue: &b, alpha: &a)
    print("\(r), \(g), \(b), \(a)")
}

If I use Color.red, the output is:
0.9999999403953552, 0.23137253522872925, 0.18823528289794922, 1.0
But if I use Color.accentColor, I get:
0.0, 0.4784314036369324, 0.9999999403953552, 1.0
instead of:
0.3059999942779541, 0.16499999165534973, 0.5180000066757202, 1.0
which is the value set in Assets.xcassets. It is not only Color.accentColor: all the colors defined in the xcassets behave this way. For example, a color with any/dark appearances will always return the "any" components. Details: https://github.com/heltena/SwiftUI-GetComponents Thanks!
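A hedged workaround (not a fix for the underlying SwiftUI behavior): asset-catalog colors are dynamic, so load the `UIColor` by name and resolve it against an explicit trait collection instead of converting through `Color`. This assumes the asset keeps Xcode's default name "AccentColor":

```swift
import UIKit

// Resolve the asset-catalog color for specific appearances rather than
// relying on UIColor(Color.accentColor), which reportedly returns the
// default system accent instead of the catalog value.
if let accent = UIColor(named: "AccentColor") {
    let light = accent.resolvedColor(with: UITraitCollection(userInterfaceStyle: .light))
    let dark = accent.resolvedColor(with: UITraitCollection(userInterfaceStyle: .dark))

    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    light.getRed(&r, green: &g, blue: &b, alpha: &a)
    print("light:", r, g, b, a)
    dark.getRed(&r, green: &g, blue: &b, alpha: &a)
    print("dark:", r, g, b, a)
}
```

Resolving with an explicit `UITraitCollection` also sidesteps the "always returns the any components" symptom, since the variant is chosen by you rather than the current environment.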
Posted by heltena. Last updated.
Post not yet marked as solved
2 Replies
586 Views
Hello all! I'm working on Earthtunes, an app for listening to the solid Earth. (Shameless plug: it's available on the App Store if you want to take a look; I'd be happy to hear from you!) The point is that this app generates a video to send a sound (not only earthquakes, but any cut of a seismometer's signal) by text, WhatsApp, and so on. Sometimes it raises an error saying it could not send the video, and I cannot find the reason for it. I'm posting a simplified project [1] that reproduces this error, and I would like to ask if you know what is going on. It seems related to the video, but I'm always using the same codec, and sometimes it works and sometimes it doesn't. Thanks a lot!!! [1] https://github.com/heltena/ExportTunes
Posted by heltena. Last updated.
Post marked as solved
4 Replies
1.8k Views
Hi all, I'm looking for an algorithm to generate random numbers in a kernel shader function, similar to curand for CUDA, but I couldn't find one. Is there an interesting library? Thank you so much,
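Metal does not ship a curand analogue; a common pattern is to hand-roll a tiny PRNG inside the kernel, seeding each thread from its thread position so the streams don't correlate. Below is a minimal xorshift32 sketch written in Swift so it can be tested; the three shift/xor lines translate verbatim to `uint` in Metal Shading Language. This is an illustration, not cryptographically secure:

```swift
// Minimal xorshift32 PRNG. In an MSL kernel you would seed `state`
// per-thread, e.g. from thread_position_in_grid, and iterate it.
func xorshift32(_ state: UInt32) -> UInt32 {
    // Avoid the all-zero fixed point by substituting an arbitrary constant.
    var x = state == 0 ? 0x9E3779B9 : state
    x ^= x << 13
    x ^= x >> 17
    x ^= x << 5
    return x
}

// Map to [0, 1): keep the top 24 bits so the result is exactly
// representable as Float and strictly below 1.
func uniform01(_ state: UInt32) -> Float {
    Float(state >> 8) * (1.0 / 16777216.0)
}
```

Some small open-source MSL PRNG libraries exist with the same hash-based structure, but the hand-rolled version above is often all a shader needs.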
Posted by heltena. Last updated.
Post marked as solved
1 Reply
419 Views
Hello, I'm trying to play some waves I'm downloading from a seismometer and the sound is not good. I decided to create a simple wave (a C5 note, 523.25 Hz) and play it, and it does not work either. Here is my code:

import AVFoundation
import Combine

class ContentViewModel: ObservableObject {
    let audioEngine: AVAudioEngine
    let player: AVAudioPlayerNode

    let data: [Double]
    let sampleRate: Double

    init() {
        let sinFrequency: Double = 523.25 /* C5 */
        let sampleRate: Double = 44100
        let seconds: Double = 5

        let range = 0 ..< Int(seconds * sampleRate)
        self.data = range.map { sin(2.0 * .pi * Double($0) * sinFrequency / sampleRate) }
        self.sampleRate = sampleRate

        audioEngine = AVAudioEngine()
        let _ = audioEngine.mainMixerNode
        audioEngine.prepare()
        try! audioEngine.start()
        try! AVAudioSession.sharedInstance().setCategory(.playback)

        self.player = AVAudioPlayerNode()
        audioEngine.attach(player)
    }

    func copyBuffer<T: FixedWidthInteger>(data: [Double], buffer: AVAudioPCMBuffer, channelData: UnsafePointer<UnsafeMutablePointer<T>>) {
        buffer.frameLength = buffer.frameCapacity
        let buffData = data.map { T(Double(T.max) * $0) }
        memcpy(channelData[0], buffData, Int(buffer.frameCapacity) * MemoryLayout<T>.size)
    }

    enum BufferType {
        case int16
        case int32
    }

    func createBuffer(for type: BufferType) -> AVAudioPCMBuffer {
        switch type {
        case .int16:
            guard
                let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
                let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
                let channelData = buffer.int16ChannelData
            else {
                fatalError()
            }
            copyBuffer(data: data, buffer: buffer, channelData: channelData)
            return buffer
        case .int32:
            guard
                let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
                let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
                let channelData = buffer.int16ChannelData
            else {
                fatalError()
            }
            copyBuffer(data: data, buffer: buffer, channelData: channelData)
            return buffer
        }
    }

    func play(for type: BufferType) {
        let buffer = createBuffer(for: type)

        let linkFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
        audioEngine.connect(player, to: audioEngine.mainMixerNode, format: linkFormat)
        audioEngine.prepare()
        audioEngine.mainMixerNode.outputVolume = 0.5
        player.scheduleBuffer(buffer, at: nil, options: .interrupts, completionHandler: nil)
        if !player.isPlaying {
            player.play()
        }
    }
}

You can listen to the reference note by searching for "Middle C Sine Wave for Ten Hours - 261.6 hertz" on YouTube (the title is wrong; this video is for C5). Could you please tell me why my sound does not sound like the real C5 note? Thanks!!! You can create a simple ContentView in SwiftUI with this code:

import SwiftUI

struct ContentView: View {
    @StateObject var viewModel = ContentViewModel()

    var body: some View {
        VStack {
            Spacer()
            HStack {
                Button("Play Int16") {
                    viewModel.play(for: .int16)
                }
                Button("Play Int32") {
                    viewModel.play(for: .int32)
                }
            }
            Spacer()
        }
    }
}
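One likely culprit (an assumption on my part, not confirmed by the thread): `AVAudioPlayerNode` expects scheduled buffers to match the format of its connection, and here the player is connected with a Float32 standard format while the buffers use `.pcmFormatInt16`. A sketch that sidesteps the conversion entirely by filling a Float32 buffer that matches the link format:

```swift
import AVFoundation

// Sketch: build a Float32 PCM buffer so its format matches the
// standardFormat (deinterleaved Float32) used to connect the player.
func makeFloatBuffer(data: [Double], sampleRate: Double) -> AVAudioPCMBuffer? {
    guard
        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1),
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: UInt32(data.count)),
        let channelData = buffer.floatChannelData
    else { return nil }
    buffer.frameLength = buffer.frameCapacity
    for (i, sample) in data.enumerated() {
        channelData[0][i] = Float(sample)  // samples already in [-1, 1]
    }
    return buffer
}
```

Scheduling this buffer on a player connected with `AVAudioFormat(standardFormatWithSampleRate:channels:)` keeps the two formats identical, which removes one variable when diagnosing the distorted tone.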
Posted by heltena. Last updated.