Good news, bad news.
Good news: I have built my first App Clip! After our app submission was accepted, we got a working "default App Clip URL" which successfully launches our App Clip card and App Clip.
Bad news: All this work was done to associate our App Clip link with our website so we could have a very clean URL, but that URL is not launching our App Clip card or the clip. Everything points to it looking good:
Diagnostics in the Apple Developer settings are all green checkmarks: associated domains, App Clip published on the App Store, Smart App Banner.
My associated domain URL is "validated" on App Store Connect.
My website has a Smart App Banner meta tag with the bundle identifier, and an Open Graph photo configured.
My App Clip has the domain in its entitlements file.
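For comparison, here is the shape of the pieces involved; the domain, Team ID, and bundle IDs below are placeholders, not our real values. The apple-app-site-association file served from /.well-known/ on the domain needs an appclips entry:

{
    "appclips": {
        "apps": ["ABCDE12345.com.example.MyApp.Clip"]
    }
}

The App Clip target's Associated Domains entitlement then carries appclips:example.com, and the Smart App Banner meta tag looks like <meta name="apple-itunes-app" content="app-id=myAppStoreID, app-clip-bundle-id=com.example.MyApp.Clip">.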
What I'm expecting: sending a text with the website's URL should show me my App Clip card and open the App Clip instead of my website.
I shouldn't need to configure an advanced App Clip experience because it's just via Messages, right? According to the documentation, advanced experiences are for Maps, QR codes, etc., right?
From what I can tell, everything is set up completely right... so how come, when I send myself a text message with the website's URL, the App Clip card doesn't pop up?
I want to convert a CGPoint into an SCNVector3. I am using ARFaceTrackingConfiguration for face tracking.
Below is my code to convert an SCNVector3 to a CGPoint:
let point = faceAnchor.verticeAndProjection(to: sceneView, facePoint: faceAnchor.geometry.vertices[0])
print(point, faceAnchor.geometry.vertices[0])
which prints the values below:
CGPoint = (350.564453125, 643.4456787109375)
SIMD3<Float>(0.014480735, 0.01397189, 0.04508282)
extension ARFaceAnchor {
    // Struct to store the 3D vertex and its 2D projection point.
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    // Project a face vertex into the view and return the 2D point.
    func verticeAndProjection(to view: ARSCNView, facePoint: Int) -> CGPoint {
        let point = SCNVector3(geometry.vertices[facePoint])
        // Build a matrix whose translation column is the vertex, then
        // move it into world space with the anchor's transform.
        let col = SIMD4<Float>(SCNVector4())
        let pos = SIMD4<Float>(SCNVector4(point.x, point.y, point.z, 1))
        let pworld = transform * simd_float4x4(col, col, col, pos)
        // Project the world-space position into screen coordinates.
        let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
        return CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
    }
}

extension matrix_float4x4 {
    /// The position (translation) of the transform matrix.
    public var position: SCNVector3 {
        SCNVector3(self[3][0], self[3][1], self[3][2])
    }
}
Now I want to convert the same CGPoint back to an SCNVector3.
I tried the code below, but it is not giving the expected value, which is SIMD3<Float>(0.014480735, 0.01397189, 0.04508282):
let projectedOrigin = sceneView.projectPoint(SCNVector3Zero)
let unproject = sceneView.unprojectPoint(SCNVector3(point.x, point.y, CGFloat(projectedOrigin.z)))
let vector = SCNVector3(unproject.x, unproject.y, unproject.z)
Is there any way to convert a CGPoint to an SCNVector3? I cannot use hitTest because this CGPoint is not on a node; it is somewhere on the face area.
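One detail worth noting, as a guess rather than a confirmed fix: unprojectPoint(_:) needs the normalized depth of the point you want to recover, and projecting SCNVector3Zero captures the depth of the world origin, not of the face vertex. A minimal sketch, assuming the world-space position pworld from the extension above is available:

// Project the known world-space point once to capture its depth (z),
// then unproject the 2D point with that same depth to recover 3D.
let projected = sceneView.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
let recovered = sceneView.unprojectPoint(SCNVector3(Float(point.x), Float(point.y), projected.z))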
Demo code:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Flip the coordinate system
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextTranslateCTM(context, 0, self.bounds.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);

    NSDictionary *attrs = @{NSFontAttributeName: [UIFont systemFontOfSize:20],
                            NSForegroundColorAttributeName: [UIColor blueColor],
                            NSUnderlineStyleAttributeName: @(NSUnderlineStyleThick),
    };
    // Make an attributed string
    NSAttributedString *attributedString = [[NSAttributedString alloc] initWithString:@"Hello CoreText!" attributes:attrs];
    CFAttributedStringRef attributedStringRef = (__bridge CFAttributedStringRef)attributedString;

    // Simple Core Text with CTFrameDraw
    CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(attributedStringRef);
    CGPathRef path = CGPathCreateWithRect(self.bounds, NULL);
    CTFrameRef frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
    //CTFrameDraw(frame, context);

    // You can comment out the 'CTFrameDraw' line and use the following lines
    // to draw with CTLineDraw instead.
    CFArrayRef lines = CTFrameGetLines(frame);
    CGPoint lineOrigins[CFArrayGetCount(lines)];
    CTFrameGetLineOrigins(frame, CFRangeMake(0, 0), lineOrigins);
    for (int i = 0; i < CFArrayGetCount(lines); i++) {
        CTLineRef line = CFArrayGetValueAtIndex(lines, i);
        CGContextSetTextPosition(context, lineOrigins[i].x, lineOrigins[i].y);
        // CTLineDraw(line, context);

        // You can comment out the 'CTLineDraw' line and use the following lines
        // to draw with CTRunDraw instead.
        // CTRunDraw loses some attributes, like NSUnderlineStyleAttributeName,
        // so you need to draw those yourself.
        CFArrayRef runs = CTLineGetGlyphRuns(line);
        for (int j = 0; j < CFArrayGetCount(runs); j++) {
            CTRunRef run = CFArrayGetValueAtIndex(runs, j);
            CTRunDraw(run, context, CFRangeMake(0, 0));
        }
    }
}
This code uses CTRunDraw to draw the content, and the underline draws and shows normally on iOS 17 with Xcode 15. But when you build it with Xcode 16 and the iOS 18 beta, the underline is missing.
I'm trying to add an SVG image to my launch screen. The SVG image works fine in the main storyboard, also used in a UIImageView, but the launch screen remains completely black; the launch screen is set with a white background, so it seems to be completely ignored. When I remove the image from the UIImageView, the launch screen is shown with the correct background color, but of course without the wished-for image. I can also correctly display text in the launch screen; it shows the text and the background color correctly. As soon as I assign an image from the asset catalog to the UIImageView in the launch screen, the launch screen is completely black, not showing anything. I tried also with a simple PNG image set instead of the SVG image, but the issue remains. How can I use an SVG image in my launch screen?
I tried the ScreenCaptureKit sample code from Apple:
ScreenCaptureKit Sample Code
When I ran it for a while, it crashed at a strange position, as shown in the attached screenshot.
The array is not empty and has a value at index 0, but it crashed anyway.
The social engineering is driving me mad, and so is the foreign direct interference.
I was trying to submit a student loan application, and the system started telling me about illogical constraints based on the page design that previously weren't an issue. Now they are a problem, and if I don't complete the page-design workflows, the funds are not released to the university advertising the master's degree services.
There also seems to be a variety of universities simultaneously competing for the same accessibility and/or student loan accessibility, and it's quite difficult to decide which one to go with, or which master's degree course to take. That is less of a problem than the UI/UX bug of not being able to complete or submit the entire funding application, which feels more like malware targeting than actual usability conformance practice.
Must I put my information into the funding website? And the university's? What about privacy?
Our app has previously not supported dark mode and we had the "Appearance" entry in our Info.plist set to "Light".
We are now about to release an update that enables dark mode support. To enable this we have:
Added a preference to our app's settings screen that lets users choose between System, Light and Dark options.
Based on the user's preference, we set the entire app's preferred color scheme using the SwiftUI .preferredColorScheme modifier on our root view (see the sketch after this list).
Removed the "Appearance" entry from our Info.plist
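For reference, a minimal sketch of the approach described above; the type and key names are illustrative, not our actual code:

import SwiftUI

enum AppearanceSetting: String {
    case system, light, dark

    // nil defers to the system appearance.
    var colorScheme: ColorScheme? {
        switch self {
        case .system: return nil
        case .light:  return .light
        case .dark:   return .dark
        }
    }
}

struct RootView: View {
    var body: some View { Text("Root") }
}

@main
struct MyApp: App {
    @AppStorage("appearance") private var appearance = AppearanceSetting.system.rawValue

    var body: some Scene {
        WindowGroup {
            RootView()
                .preferredColorScheme((AppearanceSetting(rawValue: appearance) ?? .system).colorScheme)
        }
    }
}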
This is all tested and working in our local development builds. We are now testing the app for release using an internal TestFlight build and we've run into a problem: after initially updating the app, it does not seem to detect the change to the Info.plist, and the app remains in light mode even if you change the preferred colour scheme.
If you force quit the app from the app switcher and re-launch it, the colour scheme preference starts working as expected.
This is going to be an issue for our users because when they update the app it is going to look like the new color scheme setting does not work. Having to ask customers to force quit the app from the app switcher is not really an acceptable workaround.
I'm not sure this is specifically tied to the app process being killed, because I would expect that to happen anyway when the app is updated. I'm wondering if it is related to the system caching the UISceneSession for the app, and whether force-killing it from the app switcher is what causes the cached session to be recreated.
Is this a known issue and is there any way to solve this?
I'm trying to display an overlay on the screen with the following code:
NSRect windowRect = [[NSScreen mainScreen] frame];
self.overlayWindow = [[NSWindow alloc] initWithContentRect:windowRect
                                                 styleMask:NSWindowStyleMaskBorderless
                                                   backing:NSBackingStoreBuffered
                                                     defer:NO
                                                    screen:[NSScreen mainScreen]];
[self.overlayWindow setReleasedWhenClosed:YES];
[self.overlayWindow setBackgroundColor:[NSColor colorWithCalibratedRed:0.0
                                                                 green:1.0
                                                                  blue:0.0
                                                                 alpha:0.1]];
[self.overlayWindow setAlphaValue:1.0];
[self.overlayWindow setOpaque:NO];
[self.overlayWindow setIgnoresMouseEvents:NO];
[self.overlayWindow makeKeyAndOrderFront:nil];

self.overlayWindow.ignoresMouseEvents = YES;
self.overlayWindow.level = NSScreenSaverWindowLevel;
self.overlayWindow.collectionBehavior = NSWindowCollectionBehaviorCanJoinAllSpaces | NSWindowCollectionBehaviorCanJoinAllApplications;
But when another app enters full screen, the overlay disappears, even though I set the collectionBehavior with the NSWindowCollectionBehaviorCanJoinAllApplications option. Is it possible to display an overlay on top of all other apps?
Hello,
I'm currently working on an app built with ReactJS. I can access it from all platforms (Fedora, Android, Windows, macOS) without any problem, but on iPhone I just get a blank page with this error. Any help about where it comes from, or how I can debug it, would be really appreciated.
Versions:
react: v18.2.0
iOS: v17.5.1
Host: Hostinger
Why does PDFKit delete signature widgets that have already been digitally signed?
This should not happen.
Is there an undocumented flag that needs to be set so that PDFKit doesn't remove them when loading or saving the PDF?
It's difficult to tell if it is happening at
PDFDocument(url: fileURL)
or
document.write(to: outputURL)
If a document is signed and still allows annotations, form filling, comments, etc., we should be able to load the PDF into a PDFDocument and save it without losing the certs.
Instead, the certs are gone and only the signature annotation widgets are present.
Here is a simple example of loading and then saving the PDF without any changes; it shows that the data is actually being changed:
import UIKit
import PDFKit

class ViewController: UIViewController {

    var pdfView: PDFView!
    @IBOutlet weak var myButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        pdfView = PDFView(frame: self.view.bounds)
        pdfView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view.addSubview(pdfView)
        self.view.bringSubviewToFront(myButton)

        // Load and compare the PDF data
        if let originalData = loadPDF() {
            if let loadedData = getRawDataFromLoadedPDF() {
                let isDataEqual = comparePDFData(originalData, loadedData)
                print("Is original data equal to loaded data? \(isDataEqual)")
            }
        }
    }

    @IBAction func onSave(_ sender: Any) {
        if let savedData = savePDF() {
            if let originalData = loadPDF() {
                let isDataEqual = comparePDFData(originalData, savedData)
                print("Is original data equal to saved data? \(isDataEqual)")
            }
        }
    }

    func loadPDF() -> Data? {
        guard let fileURL = Bundle.main.url(forResource: "document", withExtension: "pdf") else {
            print("Error: document.pdf not found in bundle.")
            return nil
        }
        do {
            let originalData = try Data(contentsOf: fileURL)
            if let document = PDFDocument(url: fileURL) {
                pdfView.document = document
                print("PDF loaded successfully.")
                return originalData
            } else {
                print("Error: Unable to load PDF document.")
                return nil
            }
        } catch {
            print("Error reading PDF data: \(error)")
            return nil
        }
    }

    func getRawDataFromLoadedPDF() -> Data? {
        guard let document = pdfView.document else {
            print("Error: No document is currently loaded in pdfView.")
            return nil
        }
        if let data = document.dataRepresentation() {
            return data
        } else {
            print("Error: Unable to get raw data from loaded PDF document.")
            return nil
        }
    }

    func comparePDFData(_ data1: Data, _ data2: Data) -> Bool {
        return data1 == data2
    }

    func savePDF() -> Data? {
        guard let document = pdfView.document else {
            print("Error: No document is currently loaded in pdfView.")
            return nil
        }
        let fileManager = FileManager.default
        let urls = fileManager.urls(for: .documentDirectory, in: .userDomainMask)
        guard let documentsURL = urls.first else {
            print("Error: Could not find the documents directory.")
            return nil
        }
        let outputURL = documentsURL.appendingPathComponent("document_out.pdf")
        if document.write(to: outputURL) {
            print("PDF saved successfully to \(outputURL.path)")
            do {
                let savedData = try Data(contentsOf: outputURL)
                return savedData
            } catch {
                print("Error reading saved PDF data: \(error)")
                return nil
            }
        } else {
            print("Error: Unable to save PDF document.")
            return nil
        }
    }
}
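As a debugging aid (a sketch, not a fix), one can count the signature widgets right after PDFDocument(url:) and again after dataRepresentation() to narrow down where the signing data is dropped; widgetFieldType is the documented way to identify signature widgets:

// Counts signature widget annotations in a PDFDocument.
func signatureWidgetCount(in document: PDFDocument) -> Int {
    var count = 0
    for pageIndex in 0..<document.pageCount {
        guard let page = document.page(at: pageIndex) else { continue }
        count += page.annotations.filter { $0.widgetFieldType == .signature }.count
    }
    return count
}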
I'm a macOS app developer, and I'm facing an issue with custom notification sounds in my app. After upgrading the app to include new custom notification sounds, the changes are not reflected until the system is restarted. The sounds do not update immediately after the app upgrade.
Is there a way to refresh or reload the custom notification sounds without needing a full system restart? Any guidance or best practices to handle this would be greatly appreciated.
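For context, this is roughly how the sound is attached; the file name is hypothetical:

import UserNotifications

let content = UNMutableNotificationContent()
content.title = "Update finished"
// The .caf file ships in the app bundle; after an upgrade the system
// appears to keep serving a cached copy of the old sound.
content.sound = UNNotificationSound(named: UNNotificationSoundName("NewAlert.caf"))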
Thank you!
I am currently running iOS 18 Beta 3 and am working on enabling users to paste (and copy) custom emojis (AdaptiveImageGlyph, such as Memoji, Stickers, and soon GenMoji) into a text field.
I am looking for the UTI for AdaptiveImageGlyph—something similar to "public.adaptive-image-glyph". Does anyone know if such a UTI exists?
Here’s my situation: when typing an AdaptiveImageGlyph using the system keyboard, everything functions correctly. However, if I copy some text containing an AdaptiveImageGlyph from the Notes app and paste it into my playground app, it only pastes the text. The reverse is also true: if I copy an AdaptiveImageGlyph from the playground app and paste it elsewhere, it only pastes the text.
Interestingly, copying AdaptiveImageGlyph from the Notes app and pasting it into iMessage works flawlessly, and vice versa. I am trying to achieve the same seamless functionality in my app.
Given that this feature works in iMessage and Notes, I am inclined to believe the issue might be on my side, though I recognize these are system apps and not third-party.
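One way to check what is actually on the pasteboard after copying from Notes (a debugging sketch, since I don't know the real UTI):

import UIKit

// Print every type identifier registered for the current pasteboard items.
func dumpPasteboardTypes() {
    for provider in UIPasteboard.general.itemProviders {
        print(provider.registeredTypeIdentifiers)
    }
}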
Example Code:
import SwiftUI
import UIKit

struct AdaptiveImageGlyphTextView: UIViewRepresentable {

    class Coordinator: NSObject, UITextViewDelegate {
        var parent: AdaptiveImageGlyphTextView

        init(parent: AdaptiveImageGlyphTextView) {
            self.parent = parent
        }

        func textViewDidChange(_ textView: UITextView) {
            parent.text = textView.text
        }

        func textView(_ textView: UITextView, shouldChangeTextIn range: NSRange, replacementText text: String) -> Bool {
            // Handle insertion of adaptive image glyphs here if needed
            return true
        }
    }

    @Binding var text: String

    func makeCoordinator() -> Coordinator {
        Coordinator(parent: self)
    }

    func makeUIView(context: Context) -> UITextView {
        let textView = UITextView()
        textView.delegate = context.coordinator
        textView.supportsAdaptiveImageGlyph = true
        textView.isEditable = true
        textView.isSelectable = true
        textView.font = UIFont.systemFont(ofSize: 17)

        // Enable paste with NSAdaptiveImageGlyphs
        textView.pasteConfiguration = UIPasteConfiguration(acceptableTypeIdentifiers: [
            "public.text",
            "public.image",
            "public.adaptive-image-glyph" // Replace with the correct UTI if different
        ])

        return textView
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        if uiView.text != text {
            uiView.text = text
        }
    }
}

struct ContentView: View {
    @State private var text: String = ""

    var body: some View {
        AdaptiveImageGlyphTextView(text: $text)
            .frame(height: 200)
            .padding()
    }
}

#Preview {
    ContentView()
}
Issue Description
We have an Azure pipeline that runs iOS UI tests. The tests can be triggered by a PR, by CI, or manually.
The UI tests run without issues when triggered by a PR, but they encounter the following error when triggered by CI or manually. All configurations and macOS images are identical. Could someone help us understand the issue and suggest a solution?
2024-07-12 21:13:19.217 xcodebuild[7872:72894] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:13:19.547 xcodebuild[7872:73563] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:16:41.608 xcodebuild[7872:72894] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:16:41.654 xcodebuild[7872:73563] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 766.921 elapsed -- Testing started completed.
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 0.000 sec, +0.000 sec -- start
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 766.921 sec, +766.921 sec -- end
2024-07-12 21:20:06.478 xcodebuild[7872:77327] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:20:06.575 xcodebuild[7872:79397] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
I'm trying to capture all trackpad events at the OS level and disable a few of them, say the ones in the left half of the trackpad. Following this question, I was able to listen to events in the current window's view with the following code.
final class AppKitTouchesView: NSView {

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        // We're interested in `.indirect` touches only.
        allowedTouchTypes = [.indirect]
        // We'd like to receive resting touches as well.
        wantsRestingTouches = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    private func handleTouches(with event: NSEvent) {
        // 1. Change the `in` parameter to listen to events at the OS level.
        // 2. Disable all events with `touch.normalizedPosition.x < 0.5`.
        let touches = event.touches(matching: .touching, in: self)
    }

    override func touchesBegan(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesEnded(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesMoved(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesCancelled(with event: NSEvent) {
        handleTouches(with: event)
    }
}
I'd like to accomplish two things further:
Change the `in` parameter to listen to events at the OS level.
Disable all touch events on some condition, say `touch.normalizedPosition.x < 0.5` (see the sketch below).
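For the first point, one direction worth exploring (a sketch of the general mechanism, not a confirmed solution for touches): a Quartz event tap can observe, modify, or swallow events system-wide, though it requires Accessibility permission and, as far as I know, does not expose individual trackpad touches, so the condition below is only a placeholder:

import CoreGraphics

// A session-wide event tap; returning nil from the callback swallows the event.
let mask = CGEventMask(1 << CGEventType.scrollWheel.rawValue) // illustrative event type
if let tap = CGEvent.tapCreate(
    tap: .cgSessionEventTap,
    place: .headInsertEventTap,
    options: .defaultTap,
    eventsOfInterest: mask,
    callback: { _, _, event, _ in
        // Placeholder condition: decide here whether to pass the event
        // through (return it) or swallow it (return nil).
        return Unmanaged.passUnretained(event)
    },
    userInfo: nil
) {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    CFRunLoopRun()
}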
I have a label in my project that takes up most of the screen. The label displays text generated from a variable.
I am trying to have the text in the label auto-adjust its size to fill up the entire area.
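A minimal sketch of one common approach, assuming a UIKit UILabel (the starting font size and scale factor are arbitrary):

import UIKit

let label = UILabel()
label.text = "Text generated from a variable"
label.numberOfLines = 1                     // required for adjustsFontSizeToFitWidth
label.font = UIFont.systemFont(ofSize: 200) // start large so it scales down to fill
label.adjustsFontSizeToFitWidth = true
label.minimumScaleFactor = 0.1              // allow shrinking to 10% of the base size
label.baselineAdjustment = .alignCenters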
We noticed that AVPlayerViewController does not always show the "Multi-channel" label in the audio settings of the player when playing a video asset with surround sound as an audio track (see image).
We only serve a multichannel audio track in the HLS master manifest, like this:
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_0",CHANNELS="6",NAME="Surround",LANGUAGE....
Different tvOS versions give us different outcomes on whether or not the "Multi-channel" label is shown:
DOES NOT SHOW (the label Multi-channel will not show)
Model A1842 (tvOS v 17.5.1)
Model A1625 (tvOS v 16.6)
DOES SHOW (see image)
Model A1625 (tvOS v 15.6)
This gives us the impression that whether the label is shown depends on the tvOS version. Any reason why? This is an ideal way for the user to see that the audio track has surround sound.
I have the same issue. It was reported to Apple in November 2023. Anyone else see this as a huge issue?
In WWDC24 video https://developer.apple.com/videos/play/wwdc2024/10131/ Jennifer mentioned that the app she's talking about would be available in the link below, but there's no such link provided.
Where can the mentioned app's code be found?
Hello!
I'm building a Countdown Timer for the Dynamic Island using a Live Activity.
I have two issues for which I can't find any solution:
We want to display the time in the "X minutes" format, like in this example. I went through the forum, but all the answers were wrong: they either used a Text(_:format:), which never updates in the Live Activity, or a Text(timerInterval:), which we can't format (see the sketch after this list).
I want the Live Activity to end once the timer reaches zero. I found the staleDate parameter, which I thought would help, but it actually only adds loaders to my design once the date is reached. I tried to implement a solution like the answer to this post, but the if context.isStale { ... } part is never rendered. It also looks like the stale state only gets activated when the app comes into focus again.
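For reference, the trade-off in the first point, sketched; endDate is a hypothetical value from the activity's content state:

import SwiftUI

struct CountdownText: View {
    let endDate: Date

    var body: some View {
        // Updates live in a Live Activity without pushes, but the display
        // format is fixed (e.g. "12:34"); it cannot render "X minutes".
        Text(timerInterval: Date.now...endDate, countsDown: true)
            .monospacedDigit()
    }
}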
I tried several fixes, went through a lot of forum and posts, but I can't find any solution.
Thanks!