I just converted my app to freemium, but the method I use to give users who bought the app before it was free access to the paid features does not seem to be working.
I am getting the original purchase date from the AppTransaction (https://developer.apple.com/documentation/storekit/apptransaction/originalpurchasedate) and comparing it to the date the app's business model changed. In TestFlight this works without any problems, but on the live system, users who purchased the app do not get access!
let shared = try await AppTransaction.refresh()
if case .verified(let appTransaction) = shared {
    // Send the original purchase date back to Flutter as an ISO 8601 string.
    result(appTransaction.originalPurchaseDate.ISO8601Format())
}
I am developing the app with Flutter, and result(...) sends the string back to the Flutter side.
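For context, the snippet above runs inside a method-channel handler along these lines (a simplified sketch; the channel name and plugin class here are placeholders rather than my exact code):

import Flutter
import StoreKit

public class IosFlutterChannelPlugin: NSObject, FlutterPlugin {
    public static func register(with registrar: FlutterPluginRegistrar) {
        // Channel name is a placeholder; the Dart-side IosFlutterChannel class calls into it.
        let channel = FlutterMethodChannel(name: "ios_flutter_channel", binaryMessenger: registrar.messenger())
        registrar.addMethodCallDelegate(IosFlutterChannelPlugin(), channel: channel)
    }

    public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
        switch call.method {
        case "getOriginalPurchaseDate":
            guard #available(iOS 16.0, *) else {
                result(nil) // AppTransaction requires iOS 16
                return
            }
            Task {
                do {
                    let shared = try await AppTransaction.refresh()
                    if case .verified(let appTransaction) = shared {
                        result(appTransaction.originalPurchaseDate.ISO8601Format())
                    } else {
                        result(nil) // transaction failed verification
                    }
                } catch {
                    result(nil) // refresh failed
                }
            }
        default:
            result(FlutterMethodNotImplemented)
        }
    }
}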
Here is the code on the Flutter side:
Future<void> restorePurchases() async {
  SubscriptionProvider().updatePayment(PurchaseStatus.pending);
  await InAppPurchase.instance.restorePurchases();
  if (Platform.isIOS) {
    DateTime changedToFreemium = DateTime.utc(2025, 4, 7, 11, 0, 0);
    String? purchaseDateRaw = await IosFlutterChannel().getOriginalPurchaseDate();
    if (kDebugMode) {
      print("Purchase Date Raw: $purchaseDateRaw");
    }
    if (purchaseDateRaw != null) {
      DateTime purchaseDate = DateTime.parse(purchaseDateRaw);
      if (purchaseDate.isBefore(changedToFreemium)) {
        if (kDebugMode) {
          print("Restoring legacy purchases");
        }
        SubscriptionProvider().update(true, SubscriptionStatus.active, SubscriptionType.fs_lifetime);
      } else {
        if (kDebugMode) {
          print("Not restoring legacy purchases");
        }
      }
    }
  }
  SubscriptionProvider().updatePayment(PurchaseStatus.purchased);
}
Console log when running in the test environment:
flutter: Purchase Date Raw: 2013-08-01T07:00:00Z
flutter: Restoring legacy purchases
Thanks!
Hi,
I am looking for a good way to play sounds at a high rate.
At the moment I am using AVAudioEngine: I create a couple of AVAudioPlayerNodes, and for each sound I need to play I create an AVAudioPCMBuffer.
When the app needs to play a sound, I look up the matching AVAudioPCMBuffer, take the first available AVAudioPlayerNode, and feed the buffer to it.
The timing for a metronome app has to be very precise: if it is off by about 16 ms, the user can hear that it is not playing at the right interval (at 120 BPM the beats are 500 ms apart, so a 16 ms error is audible jitter). At low speeds this works without any problems, but at high speeds it gets worse.
Maybe someone has an idea of how I can improve my approach.
It's a plugin for Flutter.
import AVFoundation

class FastSoundPlayer {
    private var audioPlayers: [SoundPlayer] = []
    private var sounds: [String: Sound] = [:]
    private var engine = AVAudioEngine()
    private let playersLock = NSLock()
    let session = AVAudioSession.sharedInstance()

    init() {
        do {
            try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
            try session.setActive(true)
            createSoundPlayers(count: 20)
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    // Selector method to handle applicationDidBecomeActiveNotification
    func applicationDidBecomeActive() {
        // Reinitialize the AVAudioEngine and reattach all nodes
        do {
            engine.reset()
            // A dedicated lock; objc_sync_enter on a Swift Array boxes a new
            // object on every call, so it never actually provided exclusion.
            playersLock.lock()
            audioPlayers.removeAll()
            createSoundPlayers(count: 20)
            playersLock.unlock()
            try engine.start()
        } catch {
            print("Error starting audio engine: \(error.localizedDescription)")
        }
    }

    func createSoundPlayers(count: Int) {
        for _ in 0..<count {
            let player = SoundPlayer()
            engine.attach(player.player)
            engine.connect(player.player, to: engine.mainMixerNode, format: nil)
            audioPlayers.append(player)
        }
    }

    func load(sound: Data, name: String) {
        sounds[name] = Sound(soundData: sound)
    }

    func play(name: String) {
        if !engine.isRunning {
            applicationDidBecomeActive()
        }
        guard let sound = sounds[name] else {
            print("Sound not found")
            return
        }
        if let player = getAvailablePlayer() {
            player.play(sound: sound)
        }
    }

    func getAvailablePlayer() -> SoundPlayer? {
        // Return the first player that is not currently playing a buffer.
        return audioPlayers.first { !$0.isPlaying }
    }
}
class SoundPlayer {
    let player = AVAudioPlayerNode()
    var isPlaying = false

    init() {
        player.volume = 1.0
    }

    func play(sound: Sound) {
        guard let buffer = sound.sound else { return }
        player.scheduleBuffer(buffer, at: nil, options: .interrupts, completionCallbackType: .dataPlayedBack) { _ in
            self.complete()
        }
        if let engine = player.engine, engine.isRunning {
            player.play()
            isPlaying = true
        }
    }

    func complete() {
        isPlaying = false
    }
}
class Sound {
    var sound: AVAudioPCMBuffer?

    init(soundData: Data) {
        do {
            let temporaryURL = FileManager.default.temporaryDirectory.appendingPathComponent("tempSound.wav")
            try soundData.write(to: temporaryURL)
            // Create an AVAudioFile from the temporary file URL
            let audioFile = try AVAudioFile(forReading: temporaryURL)
            // Use the file's processing format: read(into:) requires the buffer
            // format to match it, so a hard-coded Int16/44100/stereo format
            // would make the read throw for most files.
            let format = audioFile.processingFormat
            // Create the AVAudioPCMBuffer
            guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(audioFile.length)) else {
                // Failed to create the PCM buffer
                self.sound = nil
                return
            }
            // Read the audio file into the PCM buffer
            try audioFile.read(into: pcmBuffer)
            self.sound = pcmBuffer
        } catch {
            print("Error loading sound file: \(error.localizedDescription)")
            self.sound = nil
        }
    }
}
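One direction I have been wondering about is sample-accurate scheduling: instead of triggering each click with a fresh play() call, queue the clicks at explicit AVAudioTime sample times so the spacing does not depend on when my code happens to run. A rough sketch of the idea (the function and its parameters are illustrative, not part of the plugin above):

import AVFoundation

// Queue `beats` clicks at exact sample positions so the interval between
// them is sample-accurate, independent of scheduling jitter in app code.
func scheduleClicks(player: AVAudioPlayerNode, buffer: AVAudioPCMBuffer, bpm: Double, beats: Int) {
    player.play() // playerTime(forNodeTime:) is only valid while the player is playing
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else { return }
    let sampleRate = buffer.format.sampleRate
    let samplesPerBeat = AVAudioFramePosition((60.0 / bpm) * sampleRate)
    // Small lead-in so the first click is not scheduled in the past.
    let start = playerTime.sampleTime + AVAudioFramePosition(0.1 * sampleRate)
    for beat in 0..<beats {
        let when = AVAudioTime(sampleTime: start + AVAudioFramePosition(beat) * samplesPerBeat, atRate: sampleRate)
        player.scheduleBuffer(buffer, at: when)
    }
}

Would that be the right way to get tighter timing, or is there a better approach?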
Thanks!