I was switching the detector from QR code detection to rectangle detection, but with rectangle detection I cannot read feature.messageString from CIRectangleFeature, while with CIQRCodeFeature it works. The only things I changed were ofType: CIDetectorTypeQRCode to CIDetectorTypeRectangle and features as? [CIQRCodeFeature] to features as? [CIRectangleFeature].
Here is the code:
func processQRCodeImage(_ image: UIImage) {
    var qrCodeLink = ""
    let detector: CIDetector = CIDetector(ofType: CIDetectorTypeRectangle, context: nil, options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])!
    let ciImage: CIImage = CIImage(image: image)!
    let features = detector.features(in: ciImage)
    if let features = features as? [CIRectangleFeature] {
        for feature in features {
            qrCodeLink += feature.messageString! // Value of type 'CIRectangleFeature' has no member 'messageString' COMPILE ERROR
        }
    }
    if qrCodeLink.isEmpty {
        failedQRCoderead()
    } else {
        found(code: qrCodeLink)
        onBackPressed?()
    }
}
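For reference, CIRectangleFeature carries geometry, not a payload: unlike CIQRCodeFeature, it has no messageString. A minimal sketch of what a rectangle detector can actually return (the function name is hypothetical, the CIDetector and CIRectangleFeature APIs are standard Core Image):

```swift
import CoreImage
import UIKit

func processRectangleImage(_ image: UIImage) {
    guard let ciImage = CIImage(image: image),
          let detector = CIDetector(ofType: CIDetectorTypeRectangle,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]) else { return }
    for case let feature as CIRectangleFeature in detector.features(in: ciImage) {
        // A rectangle feature only exposes its bounds and four corner points,
        // not a decoded string.
        print(feature.bounds, feature.topLeft, feature.topRight,
              feature.bottomLeft, feature.bottomRight)
    }
}
```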
For some reason the AVAssetTrack nominalFrameRate is always 30 for high-speed videos. For 120 FPS and 240 FPS videos the nominalFrameRate property is always 30; however, for 60 FPS videos it is 60. I'm reading in the video URL through the PHPickerViewController. I have a configuration for the picker set up as follows.
configuration = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
configuration.filter = .videos
configuration.selectionLimit = 0
and the gathering of the picker results as follows
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    for result in results {
        var currentVideo = Video()
        let provider = result.itemProvider
        provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
            guard error == nil else {
                return
            }
            guard let url = url else { return }
            // create a new filename
            let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
            let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
            // copy item to app storage
            try? FileManager.default.copyItem(at: url, to: newUrl)
            currentVideo.url = newUrl.absoluteString
            self.parent.selectedVideos.append(currentVideo)
        }
    }
    // Set isPresented to false because picking has finished.
    parent.isPresented = false
}
I'm creating an AVAsset and AVAssetTrack to check the FPS of the video as follows.
var asset: AVAsset? = AVAsset(url: url)
if let asset = asset,
   let videoTrack = try? await asset.loadTracks(withMediaType: .video).last {
    let size = try? await videoTrack.load(.naturalSize)
    let fps = try? await videoTrack.load(.nominalFrameRate)
    let duration = try? await videoTrack.load(.duration)
    print(fps) // This shows 30 for 120 fps and 240 fps videos
}
I'm not sure if there's some other configuration that needs to be set to handle high speed videos or what. I'm really confused.
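One thing that may be worth ruling out (an assumption, not a confirmed diagnosis): with the default PHPickerConfiguration, loadFileRepresentation can hand back a transcoded, compatibility-friendly copy of the asset rather than the original high-frame-rate file. Asking for the current representation skips that transcoding pass:

```swift
import PhotosUI

var configuration = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
configuration.filter = .videos
configuration.selectionLimit = 0
// Ask for the asset as-is, without any transcoding of the original file.
configuration.preferredAssetRepresentationMode = .current
```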
I have an app that supports English and Arabic. If a German user has Arabic among the languages on their iPhone, the app switches to Arabic, since that language is one of their languages.
How do I get it to use only the topmost language (their most preferred language) they have set? So a German user who does not have English in their languages, or has it below Arabic, would still get English as their language.
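A sketch of one possible approach, assuming the app should follow only the user's single most-preferred language: match Locale.preferredLanguages.first against the localizations the app ships, and fall back to English when there is no match (both the shipped list and the English fallback are assumptions here):

```swift
import Foundation

// Localizations this app ships (assumption: English and Arabic).
let shipped = ["en", "ar"]

// Consider only the user's single top preference, not the whole list.
let topPreference = Locale.preferredLanguages.first ?? "en"
let matches = Bundle.preferredLocalizations(from: shipped,
                                            forPreferences: [topPreference])
// Fall back to English when the top language isn't shipped (assumption).
let chosen = matches.first ?? "en"
```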
I must be missing something simple here. I have an application I'm writing and having trouble with shifts and 64 bit unsigned integers. Here's some playground code:
import UIKit
let a: UInt64 = 0xFFFFFFFF00000000
let b: UInt64 = 0x0000000000000000
print(String(format: "a = 0x%016X", a))
print(String(format: "b = 0x%016X", b))
print(String(format: "a | b = 0x%016X", a | b))
It produces this output:
a = 0x0000000000000000
b = 0x0000000000000000
a | b = 0x0000000000000000
Why is "a" not equal to 0xFFFFFFFF00000000?
I'm using Xcode 14.3.1.
I also found issues shifting a byte into the upper 32 bits of a UInt64:
let c: UInt64 = 0x00000000000000FF
let d: UInt64 = 0x0000000000000000
print(String(format: "c = 0x%016X", c))
print(String(format: "d = 0x%016X", d))
print(String(format: "d | (c << 32) = 0x%016X", d | (c << 32)))
This produces this output:
c = 0x00000000000000FF
d = 0x0000000000000000
d | (c << 32) = 0x0000000000000000
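The culprit here is the format string, not the values or the shift: %X formats a 32-bit integer, so when a UInt64 is passed through String(format:) only part of it is read (here, the zero low half). The ll length modifier tells the formatter to consume a full 64-bit value:

```swift
import Foundation

let a: UInt64 = 0xFFFFFFFF00000000
let c: UInt64 = 0x00000000000000FF

// %016llX: "ll" reads 64 bits; "016" zero-pads to 16 hex digits.
print(String(format: "a = 0x%016llX", a))             // a = 0xFFFFFFFF00000000
print(String(format: "c << 32 = 0x%016llX", c << 32)) // c << 32 = 0x000000FF00000000
```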
From the fantastic video: https://developer.apple.com/videos/play/wwdc2023/10170/?time=298, would you have a downloadable link to that kitchen service project?
I'm trying to understand how the method:
func handleShift<Orders>(orders: Orders) async throws
is called?
Because the video shows it to be called this way:
for cook in staff.keys {
    group.addTask { try await cook.handleShift() }
}
without arguments ...
Many thanks!
I implemented Sign in with Apple, but in all cases the button is always black. I would like to show it in light/dark mode depending on the phone settings.
This is my code:
class MyAuthorizationAppleIDButton: UIButton {
    private var authorizationButton: ASAuthorizationAppleIDButton!

    @IBInspectable
    var cornerRadius: CGFloat = 3.0
    @IBInspectable
    var authButtonType: Int = ASAuthorizationAppleIDButton.ButtonType.default.rawValue
    @IBInspectable
    var authButtonStyle: Int = ASAuthorizationAppleIDButton.Style.black.rawValue

    override public init(frame: CGRect) {
        super.init(frame: frame)
    }

    required public init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override public func draw(_ rect: CGRect) {
        super.draw(rect)
        // Create ASAuthorizationAppleIDButton
        let type = ASAuthorizationAppleIDButton.ButtonType(rawValue: authButtonType) ?? .default
        let style = ASAuthorizationAppleIDButton.Style(rawValue: authButtonStyle) ?? .black
        authorizationButton = ASAuthorizationAppleIDButton(authorizationButtonType: type, authorizationButtonStyle: style)
        authorizationButton.cornerRadius = cornerRadius
        // Show authorizationButton
        addSubview(authorizationButton)
        // Use Auto Layout to make authorizationButton follow MyAuthorizationAppleIDButton's dimensions
        authorizationButton.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            authorizationButton.topAnchor.constraint(equalTo: self.topAnchor, constant: 0.0),
            authorizationButton.leadingAnchor.constraint(equalTo: self.leadingAnchor, constant: 0.0),
            authorizationButton.trailingAnchor.constraint(equalTo: self.trailingAnchor, constant: 0.0),
            authorizationButton.bottomAnchor.constraint(equalTo: self.bottomAnchor, constant: 0.0),
        ])
    }
}
So basically, with the code above I can set the button's style in the Storyboard, but even if I change the value in code, the result is based on what I chose in the Storyboard variable.
Is there any solution that would let me show the button in light/dark mode depending on the phone settings?
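One possible approach (a sketch, not the only way): rebuild the ASAuthorizationAppleIDButton whenever the trait collection changes, choosing .black in light mode and .white in dark mode. The class name below is hypothetical; the AuthenticationServices/UIKit APIs it uses are standard. The rebuild-on-trait-change strategy itself is an assumption.

```swift
import AuthenticationServices
import UIKit

final class AdaptiveAppleIDButton: UIView {
    private var button: ASAuthorizationAppleIDButton?

    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        // ASAuthorizationAppleIDButton's style is fixed at init time,
        // so recreate the button when light/dark mode flips.
        if previous?.userInterfaceStyle != traitCollection.userInterfaceStyle {
            rebuild()
        }
    }

    override func didMoveToWindow() {
        super.didMoveToWindow()
        rebuild()
    }

    private func rebuild() {
        button?.removeFromSuperview()
        let style: ASAuthorizationAppleIDButton.Style =
            traitCollection.userInterfaceStyle == .dark ? .white : .black
        let newButton = ASAuthorizationAppleIDButton(authorizationButtonType: .signIn,
                                                     authorizationButtonStyle: style)
        newButton.translatesAutoresizingMaskIntoConstraints = false
        addSubview(newButton)
        NSLayoutConstraint.activate([
            newButton.topAnchor.constraint(equalTo: topAnchor),
            newButton.leadingAnchor.constraint(equalTo: leadingAnchor),
            newButton.trailingAnchor.constraint(equalTo: trailingAnchor),
            newButton.bottomAnchor.constraint(equalTo: bottomAnchor),
        ])
        button = newButton
    }
}
```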
I have a WebView based iOS project where the keyboard is not rendering as expected on iOS 17 but works fine on devices with iOS 15,16. Attached are the images from simulator and device.
Please help.
I tried setting WebView contentInset in keyboardWillShow() but it did not work.
I'm a beginner in Swift.
Ways I tried:
Tried adding a command-line tool with a DistributedNotificationCenter observer to call a function when any screen-sharing notification fires, but later learned that screen sharing doesn't post any notifications.
import OSLog
import Foundation

os_log("TecMFA:: Starting screen sharing finder.")
let dnc = DistributedNotificationCenter.default()
dnc.addObserver(
    forName: .init("com.apple.screensharing.server"), // tried many notification names, e.g. com.apple.screensharing.curtain
    object: nil,
    queue: .main
) { notification in
    os_log("TecMFA:: Started screen sharing daemon.")
}
dispatchMain()
Created a server using Vapor as follows:
// configure.swift
import Vapor

func routes(_ app: Application) throws {
    // Define a route to handle POST requests to "/login"
    app.post("login") { req -> HTTPStatus in
        // Read the username and password from the request body
        guard let loginData = try? req.content.decode(LoginData.self) else {
            // Failed to parse request body or invalid data
            return .badRequest
        }
        let username = loginData.username
        let password = loginData.password
        print(username)
        print(password)
        // Do something with the username and password
        print("Received login request with username: \(username) and password: \(password)")
        // Return a success response
        return .ok
    }
}

// Define a struct to represent the request body data
struct LoginData: Content {
    let username: String
    let password: String
}
// routes.swift
import Vapor
import Foundation

func getLocalIPAddress() -> String? {
    let task = Process()
    task.launchPath = "/usr/sbin/ipconfig"
    task.arguments = ["getifaddr", "en0"] // Use "en0" for Wi-Fi, "en1" for Ethernet
    let pipe = Pipe()
    task.standardOutput = pipe
    task.launch()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    let output = String(data: data, encoding: .utf8)?.trimmingCharacters(in: .whitespacesAndNewlines)
    return output
}

// Called before your application initializes.
public func configure(_ app: Application) throws {
    // Register routes
    try routes(app)
    // Get the local IP address
    guard let localIPAddress = getLocalIPAddress() else {
        fatalError("Unable to get the local IP address.")
    }
    // Update the server configuration to bind to the local IP address and desired port
    app.http.server.configuration.hostname = localIPAddress
    app.http.server.configuration.port = 8080
}
It didn't work when both used the same port number. I tried using different port numbers, but the screen-sharing request comes through port 5900, so the server on 8080 cannot see it, and that didn't work either.
Any corrections and suggestions are welcome.
I am learning how to access SQLite without using Swift Data. I have a function that tests whether a table exists, and it works. The problem is that I do not understand why it works. It is my understanding that the SELECT statement returns 0 if the table does not exist and 1 if it does. But the sqlite3_step call returns 101 (SQLITE_DONE) if the table does not exist, and does not return 101 if it does. Per the SQLite documentation, 101 means the operation has completed. What am I missing here, and is there a way to capture the underlying SQLite 0 or 1 so I could test for that? Below is my function.
func doesTableExist(db: OpaquePointer?) -> Bool {
    var tableExists: Bool = true
    let testForTable = """
    SELECT name FROM sqlite_master
    WHERE type='table'
    AND name='Contact';
    """
    var testForTablePtr: OpaquePointer?
    if sqlite3_prepare_v2(db, testForTable, -1, &testForTablePtr, nil) == SQLITE_OK {
        if sqlite3_step(testForTablePtr) == SQLITE_DONE {
            tableExists = false
        }
    } else {
        print("unable to compile sql statement testing to see if table exists")
    }
    sqlite3_finalize(testForTablePtr) // release the prepared statement
    return tableExists
}
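What the 101 means: sqlite3_step does not surface a 0-or-1 "result" value; it returns SQLITE_ROW (100) each time a result row is available and SQLITE_DONE (101) when there are no more rows. A SELECT against sqlite_master yields one row if the table exists and zero rows if it doesn't, so checking for SQLITE_ROW captures exactly the fact you're after. A sketch (the function name is hypothetical; the C API calls are standard SQLite):

```swift
import SQLite3

func tableExists(_ name: String, in db: OpaquePointer?) -> Bool {
    let sql = "SELECT name FROM sqlite_master WHERE type='table' AND name=?;"
    var stmt: OpaquePointer?
    guard sqlite3_prepare_v2(db, sql, -1, &stmt, nil) == SQLITE_OK else { return false }
    defer { sqlite3_finalize(stmt) }
    // Bind the table name rather than interpolating it into the SQL.
    // The cast is the usual Swift spelling of SQLITE_TRANSIENT.
    sqlite3_bind_text(stmt, 1, name, -1,
                      unsafeBitCast(-1, to: sqlite3_destructor_type.self))
    // SQLITE_ROW (100): a matching row exists; SQLITE_DONE (101): no rows.
    return sqlite3_step(stmt) == SQLITE_ROW
}
```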
This is new in Xcode 15 beta 5. Command SwiftCompile emitted errors but did not return a nonzero exit code to indicate failure.
error: Invalid Swift parseable output message (malformed JSON): `0` (in target 'HiCoffee' from project 'HiCoffee')
error: Invalid Swift parseable output message (malformed JSON): `1` (in target 'HiCoffee' from project 'HiCoffee')
error: Invalid Swift parseable output message (malformed JSON): `{
"kind": "finished",
` (in target 'HiCoffee' from project 'HiCoffee')
"name": "compile",
"pid": -1139,
"process": {
"real_pid": 95136
},
"exit-status": 0
}
Command SwiftCompile emitted errors but did not return a nonzero exit code to indicate failure
Hello guys!
I am trying to run my application on a physical iPad. The app is successfully installed, and when I run the code it builds successfully, but when it starts the screen is just dark and Xcode says "Paused app on iPad".
I don't know what else to do. Can anyone help me with this? Thank you!
Recently, we have been testing our app on iOS 17 and found a weird crash that occurs on iOS 17 but not on iOS 16 and below.
Here's the stack:
sha512 is a computed property we wrote in a String extension:
public extension String {
    var sha512: String? {
        return self.data(using: .utf8).map { NSData(data: $0).sha512() }
    }
}
We cannot find any mistake from the code above, and the code works well on iOS 16 and below (packaged by Xcode 14.2).
And it also works on:
iOS 17 simulator running from Xcode 14.2.
iOS 17 simulator running from Xcode 15 Beta 4.
iOS 17 beta 3/4 real device running from Xcode 15 Beta 4 (debug).
It doesn't work on:
packaged by Xcode 14.2 and running on an iOS 17 beta real device.
Not tested:
packaged by Xcode 15 Beta and running on iOS 17 beta real device.
Is this an iOS 17 bug? Or does anyone know how to fix it?
I would greatly appreciate your advice. Thank you.
I've been trying to get the bash/script version of DeepFaceLab to work with Apple Silicon Macs, but this was originally a Windows project that even now has non-existent support for macOS/Apple Silicon. I am thinking of converting everything into a native macOS app using Swift, specifically optimized for Apple Silicon GPUs.
Here's what I got from ChatGPT. Any help or advice on how to do this would be greatly appreciated. I don't have any Swift programming experience, but I have experience with some coding and can generally figure things out. I know that this is probably not feasible for a single individual with little programming experience, but I wanted to throw this out there to see what others think. Thank you.
Here's a high-level overview of the steps involved in porting DeepFaceLab to Swift with a graphical UI:
Understand DeepFaceLab: Thoroughly study the DeepFaceLab project, its Python scripts, and the overall architecture to grasp its functionalities and dependencies.
Choose a Swift Framework: Decide on the UI framework you want to use for the macOS app. SwiftUI is Apple's latest UI framework that works across all Apple platforms, including macOS. Alternatively, you can use AppKit for a more traditional approach.
Rewrite Python to Swift: Convert the Python code from DeepFaceLab into Swift. You'll need to rewrite all the image processing, deep learning, and video manipulation code in Swift, potentially using third-party Swift libraries or native macOS frameworks.
Deep Learning Integration: Replace the Python-based deep learning library used in DeepFaceLab with an appropriate Swift-compatible deep learning framework. TensorFlow and PyTorch both offer Swift APIs, but you may need to adapt the specific model implementation to Swift.
Image Processing: Find equivalent Swift libraries or frameworks for image processing tasks used in DeepFaceLab.
UI Development: Design and implement the graphical user interface using SwiftUI or AppKit. You'll need to create views, controls, and navigation elements to interact with the underlying Swift code.
Integration: Connect the Swift code with the UI components, ensuring that actions in the GUI trigger the appropriate Swift functions and display results back to the user.
Testing and Debugging: Rigorously test the Swift application and debug any issues that arise during the porting process.
Optimization: Ensure that the Swift app performs efficiently and effectively on macOS devices.
After updating to Xcode 14.3 with RN 0.69.9, my project can't build anymore. It's throwing the error for all of my custom view managers.
The error I'm getting is: "A function declaration without a prototype is deprecated in all versions of C" when I'm using RCT_EXTERN_MODULE.
Output of npx react-native info:
System:
  OS: macOS 13.4.1
  CPU: (8) arm64 Apple M1 Pro
  Memory: 77.59 MB / 16.00 GB
  Shell: 5.9 - /bin/zsh
Binaries:
  Node: 16.13.1 - ~/.nvm/versions/node/v16.13.1/bin/node
  Yarn: 1.22.19 - ~/.nvm/versions/node/v16.13.1/bin/yarn
  npm: 8.1.2 - ~/.nvm/versions/node/v16.13.1/bin/npm
  Watchman: 2023.07.10.00 - /opt/homebrew/bin/watchman
Managers:
  CocoaPods: 1.12.1 - /Users/avarisco/.rvm/gems/ruby-2.7.4/bin/pod
SDKs:
  iOS SDK:
    Platforms: DriverKit 22.4, iOS 16.4, macOS 13.3, tvOS 16.4, watchOS 9.4
  Android SDK: Not Found
IDEs:
  Android Studio: Not Found
  Xcode: 14.3/14E222b - /usr/bin/xcodebuild
Languages:
  Java: Not Found
npmPackages:
  @react-native-community/cli: Not Found
  react: 18.0.0 => 18.0.0
  react-native: 0.69.9 => 0.69.9
  react-native-macos: Not Found
npmGlobalPackages:
  react-native: Not Found
I have a macOS menu bar application and I'm trying to display an NSPopover in the center of the screen. This is the code I've come up with so far:
class AppDelegate: NSObject, NSApplicationDelegate, ObservableObject {
    private var statusItem: NSStatusItem!
    private var popover: NSPopover!

    @MainActor func applicationDidFinishLaunching(_ notification: Notification) {
        // ... init of statusItem hidden
        self.popover = NSPopover()
        self.popover.contentSize = NSSize(width: 300, height: 300)
        self.popover.behavior = .transient
        self.popover.contentViewController = NSHostingController(rootView: ContentView())
    }

    @objc func togglePopover() {
        if let button = statusItem.button {
            if popover.isShown {
                self.popover.performClose(nil)
            } else {
                if let screen = NSScreen.main {
                    let screenFrame = screen.visibleFrame
                    let x = screenFrame.origin.x + (screenFrame.width - self.popover.contentSize.width) / 2
                    let y = screenFrame.origin.y + (screenFrame.height - self.popover.contentSize.height) / 2
                    let centerPoint = NSPoint(x: x, y: y)
                    popover.show(relativeTo: NSRect(origin: centerPoint, size: CGSize.zero), of: screenFrame, preferredEdge: NSRectEdge.minX)
                }
            }
        }
    }
}
Unfortunately the code above does not compile because the of parameter of popover.show expects an NSView, whereas screenFrame is an NSRect. I've tried creating a view out of the rect but this leads to a run-time error that the view is not contained within a window.
Any thoughts?
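One workaround sketch (an assumption, not a confirmed pattern): since popover.show(relativeTo:of:preferredEdge:) needs a positioning NSView that lives in a window, you can park an invisible borderless window at the center of the screen and anchor the popover to its content view:

```swift
import AppKit

func showPopoverAtScreenCenter(_ popover: NSPopover) {
    guard let screen = NSScreen.main else { return }
    let size = popover.contentSize
    // A 1x1 frame at the desired popover center.
    let frame = NSRect(x: screen.visibleFrame.midX - size.width / 2,
                       y: screen.visibleFrame.midY - size.height / 2,
                       width: 1, height: 1)
    // Invisible anchor window whose content view satisfies the `of:` parameter.
    let anchorWindow = NSWindow(contentRect: frame, styleMask: .borderless,
                                backing: .buffered, defer: false)
    anchorWindow.backgroundColor = .clear
    anchorWindow.isOpaque = false
    anchorWindow.makeKeyAndOrderFront(nil)
    if let anchorView = anchorWindow.contentView {
        popover.show(relativeTo: anchorView.bounds, of: anchorView, preferredEdge: .minY)
    }
}
```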
I am going through the Swift Tour and found an error in the second code snippet under Control Flow.
Current
var teamScore = 11
let scoreDecoration = if teamScore > 10 {
    "🎉"
} else {
    ""
}
Error: Consecutive statements on a line must be separated by ';'
Proposed
var teamScore = 11
let scoreDecoration = teamScore > 10 ? "🎉" : ""
What is the easiest way to create a passkey from an app? I have searched everywhere I could, but there is no tutorial on how to generate a passkey from the app itself.
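For what it's worth, the native entry point is AuthenticationServices: build a registration request from ASAuthorizationPlatformPublicKeyCredentialProvider and hand it to an ASAuthorizationController. A sketch, with placeholder relying-party identifier, challenge, and user ID (the challenge must really come from your server, and the domain needs an associated-domains webcredentials entitlement):

```swift
import AuthenticationServices
import Foundation

func beginPasskeyRegistration(delegate: ASAuthorizationControllerDelegate,
                              presentation: ASAuthorizationControllerPresentationContextProviding) {
    // Placeholder relying party; must match your associated domain.
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    // Placeholder challenge/user ID; fetch real values from your server.
    let request = provider.createCredentialRegistrationRequest(
        challenge: Data("server-challenge".utf8),
        name: "jane@example.com",
        userID: Data("user-id".utf8))
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    controller.presentationContextProvider = presentation
    controller.performRequests()
}
```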
I am trying to develop a new XCUITest project, and my requirement is parallel distributed testing: I have 4 iPhones and 6 test classes. Each class should run once; once it has executed, it should not execute on any other iPhone.
Example:
class to be executed: Class1, Class2, Class3, Class4, Class5, Class6
No of iPhones: iPhone1, iPhone2, iPhone3, iPhone4
Now execution should be like this
iPhone1: Class1, Class5
iPhone2: Class2, Class6
iPhone3: Class3
iPhone4: Class4
I have tried test plans, fastlane, and xcodebuild, and still can't find a solution. Does anyone have any idea how we can achieve this, or has anyone implemented this in their project?
Any help would be great.
Hello,
I'm developing a macOS application in which I want to store some credentials in the keychain. The implementation is working, but every time I run the application after making some changes to the code, the keychain asks me to enter my password.
It seems "Always Allow" only works until I change the code. It is really annoying, because it is slowing down my development significantly. Any time I make UI changes, if I want to test them I need to enter my 20-character password...
My application loads data from the network, and the credentials are required every time I run the app, so I can't really disable the keychain during development (or not easily at least)...
Is it possible to make my application as "trusted" during development, so I don't need to enter my password to retrieve the credentials from Keychain?
For reference, I created a GitHub repo that demonstrates the problem: https://github.com/ferenc-nagy/macos-swift-keychain-experiment
Each time I change the text in this UI element, I will have to re-enter my password when running the app: https://github.com/ferenc-nagy/macos-swift-keychain-experiment/blob/main/KeychainExperiment/Views/ContentView.swift#L15
VoiceOver or keyboard focus not moving to PHPViewController when presented.
When I present the PHPViewController, the VoiceOver focus does not move automatically to the presented view controller unless I tap on the screen (or select an element).
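A sketch of one common workaround (an assumption that it applies here): post a screen-changed accessibility notification once presentation completes, pointing VoiceOver at the new controller's view. The helper function is hypothetical; UIAccessibility.post is the standard UIKit API:

```swift
import UIKit

// Hypothetical presenting code; `viewController` stands in for the picker.
func presentAndFocus(_ viewController: UIViewController, from presenter: UIViewController) {
    presenter.present(viewController, animated: true) {
        // Move VoiceOver/keyboard focus to the newly presented screen.
        UIAccessibility.post(notification: .screenChanged,
                             argument: viewController.view)
    }
}
```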