Dive into the world of programming languages used for app development.

All subtopics

Post · Replies · Boosts · Views · Activity

IndexedDB in a WebView: does it get deleted?
Hi. I plan to use a WebView in an iOS app (Swift) that runs a web app built with WASM and uses IndexedDB for persistent credentials. I found rumors and second-hand reports that Apple deletes IndexedDB and localStorage data after 7 days (see the links below), but no official information that tells me whether this applies to the WebView inside an ordinary mobile app (not a PWA). A test cycle over a week to find out is hard to do. Is there any reliable and clear information on this, and am I affected? Thank you!

Links about this topic:
https://news.ycombinator.com/item?id=28158407
https://www.reddit.com/r/javascript/comments/foqxp9/webkit_will_delete_all_local_storage_including/
https://searchengineland.com/what-safaris-7-day-cap-on-script-writeable-storage-means-for-pwa-developers-332519
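For reference, a minimal sketch of the kind of setup the question is about: a WKWebView whose IndexedDB and localStorage live in the default persistent website data store (the URL is a placeholder). Whether the 7-day cap applies to this configuration is exactly what is being asked.

import UIKit
import WebKit

final class WebAppViewController: UIViewController {
    private var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let configuration = WKWebViewConfiguration()
        // .default() is the persistent, on-disk store; .nonPersistent() would be
        // memory-only and lost when the app exits.
        configuration.websiteDataStore = .default()
        webView = WKWebView(frame: view.bounds, configuration: configuration)
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)
        // Placeholder URL standing in for the WASM web app.
        webView.load(URLRequest(url: URL(string: "https://example.com/app")!))
    }
}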
Replies: 0 · Boosts: 1 · Views: 396 · Apr ’24
Complete Beginner - Any help is greatly appreciated
Hello everyone,

I will start off by saying I am a very amateur developer with some experience, mostly in C++. Over the summer I want to build an app similar to a board game and launch it on the App Store for me and my friends to play when we don't have the game's physical board. Basically, one person would host a "game" while everyone else joins through a code or something like that (maybe there's an easier way if you know everyone will be playing in person with each other). Once a game begins I want cards to show up on people's screens, and that's it, no fancy graphics or anything like that.

So, to the root of my issue: I am brand new to Swift and Xcode. I began googling and tinkering with it and made a little app where a user can add names and then pick letters from the names to display (very, very basic stuff). I also figured out how to import and manipulate images a little bit. My question is about the process of making a game, connecting it to GameKit/Game Center, and then how to actually launch it on the App Store so my friends can also download it.

If anyone has any resources they found particularly useful when starting out with Swift, please let me know. I really don't like reading straight from the documentation (although who does, honestly). Anything helps!! Thank you!
Replies: 1 · Boosts: 0 · Views: 301 · Apr ’24
Superclass - subclass - init(from decoder: Decoder) - how to
Hello together, I want to use some classes to manage information fetched from a Firestore database. My idea is to have different classes for the different states of the information. The information that is common to the different states should have the same properties, so it made sense to me to have a superclass that stores the main information and to derive a subclass with extra properties for the additional information. My question is how to define the initializers properly, so that I can store the data fetched from Firestore in one go without any loss.

Superclass (reduced to a minimum, just to show my principal problem):

class GameInfo: Codable, Identifiable {
    @DocumentID var id: String? // -> UUID of the Firestore document
    var league: String
    var homeTeam: String
    var guestTeam: String

    enum CodingKeys: String, CodingKey {
        case league
        case homeTeam
        case guestTeam
    }

    init(league: String, homeTeam: String, guestTeam: String) {
        self.league = league
        self.homeTeam = homeTeam
        self.guestTeam = guestTeam
    }
}

The subclass should contain the GameInfo properties and some others:

class Game: GameInfo {
    var startTime: Date?

    enum CodingKeys: String, CodingKey {
        case startTime
    }

    init(league: String, homeTeam: String, guestTeam: String, startTime: Date) {
        self.startTime = startTime
        super.init(league: league, homeTeam: homeTeam, guestTeam: guestTeam)
    }

    required init(from decoder: any Decoder) throws {
        let values = try decoder.container(keyedBy: CodingKeys.self)
        self.startTime = try values.decodeIfPresent(Date.self, forKey: .startTime)
        super.init(league: "", homeTeam: "", guestTeam: "")
    }
}

With the required init(from:) method, the information gets decoded and stored (the data from Firestore contains the league, homeTeam, guestTeam and startTime information). The super.init() call as written results in empty strings. What I want is for the league, homeTeam and guestTeam values to also be decoded from the Firestore data, but I don't know how. If I call super.init(league: league, homeTeam: homeTeam, guestTeam: guestTeam) within the required init(), I get the compiler error 'self' used in property access 'league' before 'super.init' call. What is wrong in my thinking? Any help appreciated. Thanks and best regards, Peter
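For what it's worth, the usual pattern for Codable class hierarchies is to let the superclass decode its own keys by forwarding the decoder with super.init(from:) instead of calling the memberwise initializer. A minimal sketch of only the subclass initializer, under the assumption that GameInfo keeps its synthesized (or a custom) init(from:):

required init(from decoder: any Decoder) throws {
    // Decode only the keys that Game itself owns...
    let values = try decoder.container(keyedBy: CodingKeys.self)
    self.startTime = try values.decodeIfPresent(Date.self, forKey: .startTime)
    // ...then hand the same decoder up, so the superclass decodes
    // league, homeTeam and guestTeam from its own CodingKeys.
    try super.init(from: decoder)
}

This also sidesteps the "'self' used before 'super.init'" error, because nothing belonging to the superclass is touched before the super call.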
Replies: 1 · Boosts: 0 · Views: 199 · Apr ’24
Apple Sign In "Sign-Up not completed"
Hi. We are getting the error "Sign-Up not completed" with Sign in with Apple. It works fine for our old apps and old bundle IDs, but it is not working in new apps with new bundle IDs. We checked with other Apple Developer team accounts and Sign in with Apple works there with the same source code, but our team account gets the error. We enabled the signing capability, added Sign in with Apple, and added the provisioning profile certificate as well, but we are still getting the same error.
Replies: 0 · Boosts: 0 · Views: 249 · Apr ’24
CoreBluetooth's connectionEventDidOccur is never fired in background
Hi, I'm trying to build an app that connects to both BR/EDR ("Classic") and BLE devices. For Classic, the recommended flow in Apple's docs is:

1. Start your app and initialize your CBCentralManager.
2. Pair the phone and the device manually through Settings.
3. This should automatically call connectionEventDidOccur.

From then on it is up to the developer to choose what to do with the information from the callback (peripheral, event, etc.), but the callback is simply never fired. Here's my basic implementation of the callback:

func centralManager(_ central: CBCentralManager, connectionEventDidOccur event: CBConnectionEvent, for peripheral: CBPeripheral) {
    print("IOS: connectionEventDidOccur for peripheral: %@", peripheral)
    if (event == .peerConnected) {
        print("IOS: Case is peer connected")
        connectToPeripheral(peripheral.identifier.uuidString)
        if (!savedPeripheralList.contains(where: { $0.identifier == peripheral.identifier })) {
            savedPeripheralList.append(peripheral)
        }
    } else if (event == .peerDisconnected) {
        print("IOS: Peer %@ disconnected!", peripheral)
    } else {
        print("IOS: if statement didn't work")
    }
}

I'm essentially:

- Printing the fact that the callback was called
- Trying to connect the app to the peripheral (by now only the phone is connected)
- Saving this newly connected peripheral to a local list

For context, I've been able to scan and connect to BLE devices like earbuds normally, so my general implementation of other callbacks like didDiscover or didConnect works just fine; this is the only non-functional callback. Any ideas would be appreciated, thanks!
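One thing worth checking that the post doesn't show: connectionEventDidOccur is only delivered for peripherals that match a prior registerForConnectionEvents(options:) call on the central. A minimal sketch, with a placeholder service UUID:

func centralManagerDidUpdateState(_ central: CBCentralManager) {
    guard central.state == .poweredOn else { return }
    // Without this registration, centralManager(_:connectionEventDidOccur:for:)
    // is not called. The UUID below is a placeholder for the accessory's service.
    let accessoryService = CBUUID(string: "180A")
    central.registerForConnectionEvents(options: [.serviceUUIDs: [accessoryService]])
}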
Replies: 0 · Boosts: 1 · Views: 257 · Apr ’24
Why is the text not complying with the contour tag?
I got this SSML from w3.org. AVSpeechUtterance(ssmlRepresentation:) is not complying with the contour; it doesn't change the pitch (Hz).

<?xml version="1.0"?>
<speak version="1.1"
       xmlns="http://www.w3.org/2001/10/synthesis"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.w3.org/2001/10/synthesis http://www.w3.org/TR/speech-synthesis11/synthesis.xsd"
       xml:lang="en-US">
  <prosody contour="(0%,+20Hz) (10%,+30%) (40%,+10Hz)">
    good morning
  </prosody>
</speak>

override func viewDidLoad() {
    super.viewDidLoad()
    guard let localUtterance = AVSpeechUtterance(ssmlRepresentation: self.speechSML) else {
        print("SML did not work.")
        return
    }
    self.utterance = localUtterance
    self.utterance.voice = self.voiceNoelle
}

self.synthesizer.speak(self.utterance)
Replies: 1 · Boosts: 0 · Views: 335 · Apr ’24
Testing in-app purchases in the sandbox environment
I have been trying to test in-app purchases in the sandbox environment, but it keeps performing transactions in the Xcode environment instead. Also, Apple's docs say to set up the sandbox tester account under Settings > App Store without signing out of the non-sandbox account, and that the sign-in prompt will appear as soon as we perform a transaction on the device, but that did not work for me.

I have ensured and tried the following things:

- Created a sandbox tester ID
- StoreKit configured, all the plans fetched
- Has a development team
- Tried signing in with the sandbox account
- Checked whether I am testing a production or development build via an if/else (it was development)
- Tested on a real device

Please help me get through this.
Replies: 0 · Boosts: 0 · Views: 281 · Apr ’24
NWPathMonitor and mobile WiFi router without SIM: always reports connected
Hello, I'm looking for a way to detect, using NWPathMonitor, when the iOS device is connected to a router but not to the internet; for example, a mobile WiFi router without a SIM. In Settings I'm able to switch the connection to its WiFi, and once connected a label below the SSID shows "Not connected to the internet". I would like to show the same thing to the user inside my app, but unfortunately I always get a .satisfied status. Am I missing something in configuring NWPathMonitor or in reading the result?

final class InternetConnectionMonitor {
    lazy var internetConnectionStatusPublisher: AnyPublisher<InternetConnectionStatus, Never> = {
        _internetConnectionStatusSubject
            .compactMap { $0 }
            .eraseToAnyPublisher()
    }()

    var lastInternetConnectionStatus: InternetConnectionStatus? {
        _internetConnectionStatusSubject.value
    }

    private let _internetConnectionStatusSubject = CurrentValueSubject<InternetConnectionStatus?, Never>(nil)
    private let pathMonitor = NWPathMonitor()
    private let pathMonitorQueue = DispatchQueue(label: "com.xxxxx-network-monitor", qos: .default)

    init() {
        startPathMonitoring()
    }

    private func startPathMonitoring() {
        pathMonitor.pathUpdateHandler = { [weak self] path in
            guard let self else { return }
            let networkStatus = InternetConnectionStatus(from: path)
            self._internetConnectionStatusSubject.send(networkStatus)
        }
        pathMonitor.start(queue: pathMonitorQueue)
    }
}
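For context, NWPath.status == .satisfied only indicates that a usable route exists on some interface; as far as I can tell it does not verify that traffic actually reaches the internet, which is why Settings can show "Not connected to the internet" while the monitor still reports satisfied. A common workaround is to pair the monitor with an explicit probe. A minimal sketch, with a placeholder endpoint:

// Returns true only if an actual HTTP round trip succeeds; call this when
// NWPathMonitor reports .satisfied to distinguish "has a route" from "has internet".
func hasInternetAccess() async -> Bool {
    // Placeholder endpoint; use a lightweight, reliable URL you control.
    guard let url = URL(string: "https://example.com/health") else { return false }
    var request = URLRequest(url: url)
    request.httpMethod = "HEAD"
    request.timeoutInterval = 5
    do {
        let (_, response) = try await URLSession.shared.data(for: request)
        return (response as? HTTPURLResponse).map { (200..<400).contains($0.statusCode) } ?? false
    } catch {
        return false
    }
}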
Replies: 9 · Boosts: 0 · Views: 489 · Apr ’24
Add new Labels to MLImageClassifier of existing Checkpoint/Session
Hey, I just created and trained an MLImageClassifier via the MLImageClassifier.train() method (https://developer.apple.com/documentation/createml/mlimageclassifier/train(trainingdata:parameters:sessionparameters:)).

For my training data (MLImageClassifier.DataSource) I am using my directory structure, so I have an images folder with subfolders person1, person2, person3, etc., which contain images of the labeled persons (https://developer.apple.com/documentation/createml/mlimageclassifier/datasource/labeleddirectories(at:)).

I am saving the checkpoints and sessions in my app directory, so I can create an MLImageClassifier from an existing MLSession and/or MLCheckpoint.

My question is: is there any way to add new labels, ideally from my directory structure, to an MLImageClassifier that I create from an existing MLCheckpoint/MLSession? For example, adding a person4 and training my pretrained classifier with only that person4. Or is it simply not possible, and do I have to train from the beginning every time I want to add a new label? Unfortunately I cannot find anything in the API. Thanks!
Replies: 0 · Boosts: 0 · Views: 456 · Apr ’24
iOS 17 UIImageReader has memory leaks
In my SwiftUI view, I try to load the image from data.

var body: some View {
    Group {
        if let data = model.detailImageData, let uiimage = UIImage(data: data) { // no memory issue
            Image(uiImage: uiimage)
                .resizable()
                .scaledToFit()
        }
    }
}

But I want to get the HDR version of my image, so I use:

if let data = model.detailImageData, let uiimage = UIImageReader.default.image(data: data) { // memory leaks!!!

When I change the data, the memory of the previous image is never freed, which finally caused my app to crash. You can see it in the Instruments screenshot.
Replies: 1 · Boosts: 1 · Views: 413 · Apr ’24
Memory Leak in ImageIO?
I use this code to show the image in HDR in SwiftUI:

struct HDRImageView: UIViewRepresentable {
    // Set up a common reader for all UIImage read requests.
    static let reader: UIImageReader = {
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        return UIImageReader(configuration: config)
    }()

    let data: Data?
    let enableHDR: Bool

    func makeUIView(context: Context) -> UIImageView {
        let view = UIImageView()
        view.preferredImageDynamicRange = enableHDR ? .high : .standard
        update(view)
        // Set this view to fit itself to the parent view.
        view.setContentCompressionResistancePriority(.defaultLow, for: .horizontal)
        view.setContentCompressionResistancePriority(.defaultLow, for: .vertical)
        view.setContentHuggingPriority(.required, for: .horizontal)
        view.setContentHuggingPriority(.required, for: .vertical)
        return view
    }

    func updateUIView(_ view: UIImageView, context: Context) {
        update(view)
    }

    func update(_ view: UIImageView) {
        autoreleasepool { // not working
            if let data = data {
                view.image = nil // setting to nil first is not working
                view.image = HDRImageView.reader.image(data: data)
            } else {
                view.image = nil
            }
            view.preferredImageDynamicRange = enableHDR ? .high : .standard
        }
    }
}

But when I update the input data, it seems the old image data cannot be freed. After several changes, the app takes too much memory and crashes. I found it is the VM: ImageIO_Surface_Data and the VM: Image_IO regions that take up the memory. If I change HDRImageView into a normal Image(uiImage: UIImage(data:)) it no longer has this issue. Is it a memory leak, and how can I solve it?

Update: I then tried using Image(_:cgImage), and it appears to give the same result.
Replies: 0 · Boosts: 0 · Views: 394 · Apr ’24
Implementing A Direct Link to App Settings
Howdy,

I have a ***** feeling that the answer to my question is "Y'all cain't do that!", but I figure I'll ask anyway.

THE SAD STORY (GET YOUR HANKY): We have an app that implements Sign [up|in] with Apple. It does it pretty well, with no password visible to the user, and a pretty smooth UX. The issue is what happens when users bork their install. We don't think it will happen often, but we want to be able to give the user the best way out, if possible. With the regular (non-SiiA) method, they bonk on a "Forgot Password" button, and the app sends them a new password. We can't do that with SiiA. The password is stored in the app (in the keychain, so it's very persistent, and shared across devices), and it would be a Very Bad Security Hole to allow users to simply send a new password to the server (the other method generates a random one on the server), which is what would happen with our method of handling the password. It would also be equally bad if the server could simply send a new password to the user, directly to their device (the other method sends an email, based on the sign-in information on the server). So the user needs to delete their keychain data completely, which we can easily do, but that does not deal with their SiiA stuff stored on Apple's server. This is what Apple tells us to do, to delete that.

WHICH BEGS THE QUESTION: Is there a URL scheme that I can use to directly open that panel? If so, it would allow us to create a screen that helps the user do all the deletions (on the device, our server, and the Apple server).
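As far as I know there is no documented URL scheme that jumps straight to the Apple ID "Sign in with Apple" management panel; the only officially supported deep link of this kind opens the app's own page in Settings. A minimal sketch of that call, for comparison:

// Opens this app's own settings page; it does not reach the Apple ID panel
// where Sign in with Apple usage is managed.
if let url = URL(string: UIApplication.openSettingsURLString),
   UIApplication.shared.canOpenURL(url) {
    UIApplication.shared.open(url)
}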
Replies: 3 · Boosts: 0 · Views: 626 · Apr ’24
Doubts about running new syntax in an older version of Swift
I have an app that uses Swift 4.2. I am going to update the Swift Compiler Language setting to Swift 5, but I have some confusion after testing. I was able to run async/await code when using Xcode 15 with the language version set to Swift 4.2, and it was okay. But async/await was introduced in Swift 5.5, so why does it work? Is it related to the Xcode Swift version? And if I use Swift compiler language version 4.2 to run async/await code, what problems might there be?
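If I understand the setup correctly, the "Swift Language Version" build setting selects a source-compatibility mode of the toolchain that ships with Xcode; it does not swap in an old compiler, so features such as async/await that the current compiler and SDK support can still compile in 4.2 mode. A small sketch of the distinction, using the standard conditional-compilation directives:

#if compiler(>=5.5)
// True in Xcode 15 even with the language mode set to 4.2:
// this checks the version of the compiler actually doing the build.
#endif

#if swift(>=5.5)
// False when the language mode is 4.2:
// this checks the selected source-compatibility (language) mode.
#endif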
Replies: 1 · Boosts: 0 · Views: 380 · Apr ’24
Dispatch queue: com.apple.NSURLSession-delegate crash after executing self-written shortcut
Currently I am trying to create some shortcuts for my iOS 17 app. The AppIntent looks like this:

class PostClass {
    var request: URLRequest

    public init() {
        let url = URL(string: "http://myurl")!
        var request = URLRequest(url: url)
        request.httpMethod = "POST"

        struct Message: Encodable {
            let device_type: String
        }

        let message = Message(
            device_type: "device"
        )
        let data = try! JSONEncoder().encode(message)
        request.httpBody = data
        request.setValue(
            "application/json",
            forHTTPHeaderField: "Content-Type"
        )
        self.request = request
    }
}

var activateDevice = PostClass()

@available(iOS 17, *)
struct ActivateIntent: AppIntent {
    static let title: LocalizedStringResource = "Activate"

    func perform() async throws -> some IntentResult {
        let task = URLSession.shared.dataTask(with: activateDevice.request) { data, response, error in
            let statusCode = (response as! HTTPURLResponse).statusCode
            if statusCode == 200 {
                print("SUCCESS")
            } else {
                print("FAILURE")
            }
        }
        task.resume()
        return .result()
    }
}

Unfortunately, the shortcut throws an error after every second execution. I looked into the .ips file and saw the following traceback:

Thread 2 Crashed:
0   Runner                   0x1028ddd30 closure #1 in ActivateIntent.perform() + 388
1   CFNetwork                0x1a6f39248 0x1a6f2a000 + 62024
2   CFNetwork                0x1a6f57210 0x1a6f2a000 + 184848
3   libdispatch.dylib        0x1adced13c _dispatch_call_block_and_release + 32
4   libdispatch.dylib        0x1adceedd4 _dispatch_client_callout + 20
5   libdispatch.dylib        0x1adcf6400 _dispatch_lane_serial_drain + 748
6   libdispatch.dylib        0x1adcf6f64 _dispatch_lane_invoke + 432
7   libdispatch.dylib        0x1add01cb4 _dispatch_root_queue_drain_deferred_wlh + 288
8   libdispatch.dylib        0x1add01528 _dispatch_workloop_worker_thread + 404
9   libsystem_pthread.dylib  0x201dd4f20 _pthread_wqthread + 288
10  libsystem_pthread.dylib  0x201dd4fc0 start_wqthread + 8

Is there any way to locate the error with this information? Does it have something to do with the fact that my code is not thread-safe? Or is there an internal bug? Grateful for any help, thanks in advance!
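One detail that stands out: perform() returns immediately after task.resume(), so the intent can finish while the completion handler is still running, and the force-cast to HTTPURLResponse will also crash whenever error is non-nil and response is nil. A sketch of an awaited variant that keeps the request inside the intent's lifetime (it reuses the poster's activateDevice.request):

func perform() async throws -> some IntentResult {
    // Await the response instead of firing a detached data task, so the
    // intent cannot return before the network call completes.
    let (_, response) = try await URLSession.shared.data(for: activateDevice.request)
    if let http = response as? HTTPURLResponse, http.statusCode == 200 {
        print("SUCCESS")
    } else {
        print("FAILURE")
    }
    return .result()
}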
Replies: 5 · Boosts: 0 · Views: 515 · Apr ’24
High kernel dispatch overhead for Metal from Swift
I'm implementing a bitonic sort in Metal from a Swift app. This requires hundreds of kernel dispatch calls, one for each of the swap stages that touch the whole array; the work required by the GPU is small. I haven't been able to get this to run fast enough from Swift, and it seems to be due to a high overhead for each dispatch command. I rewrote the test program in Objective-C with a super-simple kernel function, and it runs 25x faster from Objective-C!

Kernel function:

kernel void fill(device uint8_t *array [[buffer(0)]],
                 const device uint32_t &N [[buffer(1)]],
                 const device uint8_t &value [[buffer(2)]],
                 uint i [[thread_position_in_grid]])
{
    if (i < N) {
        array[i] = value;
    }
}

The Swift code is:

func fill(pso: MTLComputePipelineState, buffer: MTLBuffer, N: Int, passes: Int) {
    guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }
    let gridSize = MTLSizeMake(N, 1, 1)
    var threadGroupSize = pso.maxTotalThreadsPerThreadgroup
    if (threadGroupSize > N) {
        threadGroupSize = N
    }
    let threadgroupSize = MTLSizeMake(threadGroupSize, 1, 1)
    for pass in 0..<passes {
        guard let computeEncoder = commandBuffer.makeComputeCommandEncoder() else { return }
        var value: UInt8 = UInt8(pass)
        var NN: UInt32 = UInt32(N)
        computeEncoder.setComputePipelineState(pso)
        computeEncoder.setBuffer(buffer, offset: 0, index: 0)
        computeEncoder.setBytes(&NN, length: MemoryLayout<UInt32>.size, index: 1)
        computeEncoder.setBytes(&value, length: MemoryLayout<UInt8>.size, index: 2)
        computeEncoder.dispatchThreadgroups(gridSize, threadsPerThreadgroup: threadgroupSize)
        computeEncoder.endEncoding()
    }
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let commandQueue = device.makeCommandQueue()!
let funcFill = library.makeFunction(name: "fill")!
let pso = try? device.makeComputePipelineState(function: funcFill)
var N = 16384
let passes = 100
let buffer = device.makeBuffer(length: N, options: [.storageModePrivate])!

for _ in 1...10 {
    let startTime = DispatchTime.now()
    fill(pso: pso!, buffer: buffer, N: N, passes: passes)
    let endTime = DispatchTime.now()
    let elapsedTime = endTime.uptimeNanoseconds - startTime.uptimeNanoseconds
    print("Elapsed time:", Float(elapsedTime)/1_000_000, "ms")
}

and the Objective-C code (which should be almost identical) is:

void fill(id<MTLCommandQueue> commandQueue, id<MTLComputePipelineState> funcPSO, id<MTLBuffer> A, uint32_t N, int passes)
{
    id<MTLCommandBuffer> commandBuffer = [commandQueue commandBuffer];
    MTLSize gridSize = MTLSizeMake(N, 1, 1);
    NSUInteger threadGroupSize = funcPSO.maxTotalThreadsPerThreadgroup;
    if (threadGroupSize > N) {
        threadGroupSize = N;
    }
    MTLSize threadgroupSize = MTLSizeMake(threadGroupSize, 1, 1);
    for (uint8_t pass = 0; pass < passes; pass++)
    {
        id<MTLComputeCommandEncoder> computeEncoder = [commandBuffer computeCommandEncoder];
        [computeEncoder setComputePipelineState:funcPSO];
        [computeEncoder setBuffer:A offset:0 atIndex:0];
        [computeEncoder setBytes:&N length:sizeof(uint32_t) atIndex:1];
        [computeEncoder setBytes:&pass length:sizeof(uint8_t) atIndex:2];
        [computeEncoder dispatchThreads:gridSize threadsPerThreadgroup:threadgroupSize];
        [computeEncoder endEncoding];
    }
    [commandBuffer commit];
    [commandBuffer waitUntilCompleted];
}

int main()
{
    NSError *error;
    id<MTLDevice> device = MTLCreateSystemDefaultDevice();
    id<MTLLibrary> library = [device newDefaultLibrary];
    id<MTLCommandQueue> commandQueue = [device newCommandQueue];
    id<MTLFunction> funcFill = [library newFunctionWithName:@"fill"];
    id<MTLComputePipelineState> pso = [device newComputePipelineStateWithFunction:funcFill error:&error];

    // Prepare data
    int N = 16384;
    int passes = 100;
    id<MTLBuffer> bufferA = [device newBufferWithLength:N options:MTLResourceStorageModePrivate];

    for (int it = 1; it <= 10; it++) {
        CFTimeInterval startTime = CFAbsoluteTimeGetCurrent();
        fill(commandQueue, pso, bufferA, N, passes);
        CFTimeInterval duration = CFAbsoluteTimeGetCurrent() - startTime;
        NSLog(@"Elapsed time: %.1f ms", 1000*duration);
    }
}

The Swift output is:

Elapsed time: 89.35556 ms
Elapsed time: 63.243744 ms
Elapsed time: 62.39568 ms
Elapsed time: 62.183224 ms
Elapsed time: 63.741913 ms
Elapsed time: 63.59463 ms
Elapsed time: 62.378654 ms
Elapsed time: 61.746098 ms
Elapsed time: 61.530384 ms
Elapsed time: 60.88774 ms

The Objective-C output is:

2024-04-18 19:27:45.704 compute_test[3489:92754] Elapsed time: 3.6 ms
2024-04-18 19:27:45.706 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.709 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.712 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.714 compute_test[3489:92754] Elapsed time: 2.7 ms
2024-04-18 19:27:45.717 compute_test[3489:92754] Elapsed time: 2.8 ms
2024-04-18 19:27:45.720 compute_test[3489:92754] Elapsed time: 2.8 ms
2024-04-18 19:27:45.723 compute_test[3489:92754] Elapsed time: 2.7 ms
2024-04-18 19:27:45.726 compute_test[3489:92754] Elapsed time: 2.5 ms
2024-04-18 19:27:45.728 compute_test[3489:92754] Elapsed time: 2.5 ms

I compile the Swift code for Release, optimised for speed. I can't believe there should be a difference here, so what could be different, and what might I be doing wrong?

Thanks, Adrian
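One possible explanation worth noting: the two programs may not be dispatching the same amount of work. The Swift loop calls dispatchThreadgroups(gridSize, threadsPerThreadgroup: threadgroupSize), which launches N threadgroups (N × threadGroupSize threads), while the Objective-C loop calls dispatchThreads:gridSize, which launches N threads in total. A sketch of the Swift call that mirrors the Objective-C version:

// Inside the Swift fill(...) loop: dispatch N threads in total
// (non-uniform threadgroups), matching the Objective-C encoder call.
computeEncoder.dispatchThreads(gridSize, threadsPerThreadgroup: threadgroupSize)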
Replies: 3 · Boosts: 0 · Views: 426 · Apr ’24
TabView not rendering correctly in a menu bar app
I'm building a macOS menu bar app (the build target is 14.0) and need a Settings window as part of it. I define the window as a standalone view in my @main block as follows:

struct xyzApp: App {
    var body: some Scene {
        MenuBarExtra {
            MenubarView()
        } label: {
            Label("XYZ", image: "xyz")
        }
        .menuBarExtraStyle(.window)

        Window("Settings", id: "settings-window") {
            SettingsView()
        }.windowResizability(.contentSize)
    }
}

The Settings view looks like this:

var body: some View {
    TabView {
        Form {
        }.tabItem {
            Label("Tab1", systemImage: "gear")
        }
        Form {
        }.tabItem {
            Label("Tab2", systemImage: "gear")
        }
    }
}

However, the TabView is not being rendered correctly: there's no image and the sizing is wrong. I tested the same code in a regular app with a Settings() declaration in the @main block and it works fine. Any pointers on what I'm doing wrong would be very helpful. Thanks!
Replies: 1 · Boosts: 0 · Views: 240 · Apr ’24
No Metrics available in MLJob
Hey, I'm training an MLImageClassifier via the train() method:

guard let job = try? MLImageClassifier.train(trainingData: trainingData, parameters: modelParameter, sessionParameters: sessionParameters) else {
    debugPrint("Training failed")
    return
}

Unfortunately the metrics of my MLProgress, which is created from the MLJob returned while training, are empty. Code for listening to progress:

job.progress.publisher(for: \.fractionCompleted)
    .sink { [weak job] fractionCompleted in
        guard let job = job else {
            debugPrint("failure in creating job")
            return
        }
        guard let progress = MLProgress(progress: job.progress) else {
            debugPrint("failure in creating progress")
            return
        }
        print("ProgressPROGRESS: \(progress)")
        print("Progress: \(fractionCompleted)")
    }
    .store(in: &subscriptions)

Printing the progress ends in:

MLProgress(elapsedTime: 2.2328420877456665, phase: CreateML.MLPhase.extractingFeatures, itemCount: 32, totalItemCount: Optional(39), metrics: [:])

I got the same result when listening to MLCheckpoints; the metrics are empty as well:

MLCheckpoint(url: URLPATH.checkpoint, phase: CreateML.MLPhase.extractingFeatures, iteration: 32, date: 2024-04-18 11:21:18 +0000, metrics: [:])

Can someone tell me how I can access the metrics while training? Thanks!
Replies: 0 · Boosts: 0 · Views: 386 · Apr ’24