Post not yet marked as solved
Has the problem of the black background when taking a picture with the AR Quick Look capture function in iOS 15 been resolved? I think it's a rather serious bug.
On an iOS 15 device, go to the Quick Look gallery below and open any of the 3D models in AR Quick Look; the background will be black when you take a picture with AR Quick Look's capture function.
https://developer.apple.com/augmented-reality/quick-look/
There are similar reports, such as the one below, but they do not seem to have been addressed at all.
https://developer.apple.com/forums/thread/691784
iOS 15 changes keyboard handling. Besides the usual overriding of keyCommands, a programmer now has to give their own commands higher priority than the system commands.
The following code works for me:
final class MyViewController: UIViewController {
    override var keyCommands: [UIKeyCommand]? {
        let action = UIKeyCommand(input: UIKeyCommand.inputUpArrow, modifierFlags: [], action: #selector(myAction))
        if #available(iOS 15, *) {
            action.wantsPriorityOverSystemBehavior = true
        }
        return [action]
    }
    // rest of the code
}
On the other hand, almost the same code, but inheriting from QLPreviewController, doesn't work:
final class PreviewViewController: QLPreviewController {
// same internals as above
}
In the view hierarchy I spotted that there is actually some kind of extra navigation; maybe that is stealing my key presses, and maybe that's why the code above isn't working.
Does anyone have an idea how to fix or work around this issue? Obviously, the extra navigation comes from Apple, and I don't have access to those extra presented controllers.
I tried printing the following details, but none of them helps:
print(presentedViewController ?? "none") // output: none
print(topMostViewController) // output: <MyApp.PreviewViewController: 0x7fbe44156203>
print(inputViewController ?? "none") // output: none
It's worth adding that I've also tried overriding this method:
override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
    print("test?")
    super.pressesBegan(presses, with: event)
}
And "test?" never appears in the console, whatever key I press; clearly something else is stealing the key presses. This did not happen before iOS 15.
To conclude, I think this may be a bug in the API. Probably no one thought of this scenario when adding the new keys?
Recently I have been learning OpenCV on the Mac. Following online tutorials, I configured OpenCV for C++ in Xcode and implemented some simple examples.
The biggest problem I've encountered is how to preview images in debug mode. A cv::Mat image cannot be previewed directly, which is very unfriendly for debugging. I found some proposed solutions online, but the results were disappointing, at least for a programming beginner like me. I would like Xcode to offer what other IDEs already have: the "OpenCV Image Viewer" plug-in available for PyCharm and CLion on the Mac, or the "Image Watch" plug-in for Visual Studio on Windows.
For me, this feature is essential. Given that, is learning OpenCV C++ with Xcode a wrong choice, and would embracing CLion on the Mac be the best choice for me? (Considering that Xcode is primarily an IDE for Swift, and I don't know much about Swift.) The process of implementing these simple examples in Xcode has still been very enjoyable, and I like Xcode's UI design. Therefore, I still hope there is a good way to solve this problem in Xcode.
Here is the environment I'm using to run C++ OpenCV in Xcode:
Xcode Version: 13.2.1 (13C100)
OpenCV Version: stable 4.5.3 (bottled)
I'm seeing an issue using [UIView drawViewHierarchyInRect:afterScreenUpdates:] to take an app "screenshot". Specifically, I'm seeing that a PDF rendered using QLPreviewController doesn't render in the screenshot when the code is run on a device.
I'm using this UIView API in conjunction with UIGraphicsBeginImageContextWithOptions and UIGraphicsEndImageContext - the simplified code is as follows:
UIWindow *appWindow = [[[UIApplication sharedApplication] delegate] window];
UIGraphicsBeginImageContextWithOptions(imageSize, NO, 1.0);
CGContextRef graphicsContext = UIGraphicsGetCurrentContext();
BOOL result = [appWindow drawViewHierarchyInRect:screenBounds afterScreenUpdates:NO];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
and I'm setting up my QLPreviewController as follows:
- (void)pdfButtonTapped:(nonnull UIButton *)button {
    QLPreviewController *nextViewController = [QLPreviewController new];
    nextViewController.dataSource = self;
    [self presentViewController:nextViewController animated:YES completion:nil];
}
#pragma mark - QLPreviewControllerDataSource
- (NSInteger)numberOfPreviewItemsInPreviewController:(nonnull QLPreviewController *)controller {
    return 1;
}

- (nonnull id<QLPreviewItem>)previewController:(nonnull QLPreviewController *)controller previewItemAtIndex:(NSInteger)index {
    return [NSBundle.mainBundle URLForResource:@"menu_web_november" withExtension:@"pdf"];
}
I'm running this code on both device and simulator:
Device: iPad (6th generation), iOS 15.2 [Screen 1024x768]
Simulator: iPad (9th generation), iOS 15.2, Xcode 13.2.1 (13C100) [Screen 1080x810]
The results I'm seeing differ between simulator and device.
On device, I see that only the navigation bar is rendered into the final image:
On simulator, I see that the full view (including the PDF) is rendered into the final image:
On both device and simulator, the result returned from the [UIView drawViewHierarchyInRect:afterScreenUpdates:] call is YES.
I'm capturing these screenshots by setting a breakpoint after UIGraphicsEndImageContext(), choosing the image variable in the debug inspector, and pressing the "eye" button; note that these are also the results I see when running the code in release.
I have a couple of questions about these results:
Is it expected that QLPreviewController contents (for PDF at least) don't render in the snapshot image on device?
Is it expected that QLPreviewController contents (for PDF at least) do render in the snapshot image on the simulator?
If this behaviour is expected, is there a way of telling which UIView subclasses are "compatible" with the [UIView drawViewHierarchyInRect:afterScreenUpdates:] API and which won't render their contents in a snapshot?
Thanks for your help in advance,
Ceri
We create reality files on Reality Composer on an iPad Pro 2020 (iPad OS 14.8.1).
They can be opened in AR correctly on iOS 14 devices (our iPad & an iPhone 12 Pro).
However, on our iOS 15 iPhone X (iOS 15.2.1), AR Quick Look opens but displays an error message saying “the object cannot be opened”.
I have tried creating a .reality file in Reality Composer directly on my iOS 15 iPhone X (I just exported the template project for horizontal plane tracking), but the exported file fails to open on the very same device.
However, the file generated on my iOS 15 iPhone X opens fine on my iPad Pro (iPad OS 14.8.1).
USDZ files work fine on iOS though.
Any lead on how to solve this issue?
I'm loading a PDF with more than 5 pages in QLPreviewController. I edit the PDF page by page through the last page and then print, but the print preview doesn't show the properly edited PDF: only the first two pages appear edited, and the remaining pages look unedited.
Hi,
When using FileProvider, files shown in Finder do not trigger a download when hitting Space to preview or audition them with Quick Look.
Can this be configured somehow?
I have an NSImageView-based preview appex plugin for macOS in Objective-C. It's based off the sample template that you can add to an app.
It makes it all the way through preparePreviewOfFileAtURL: I stuff pixel data into a CGImage, put that into an NSImage, and set that on an NSImageView which is self.view from the storyboard. I can see all the data via qlmanage -p, since it will print error messages for caveman debugging. There are no docs on using the "Quick Look Simulator", which appears to do nothing, so I have no idea how to use it.
I set the background color of the NSImageView.layer to red, and it shows up red, so I know the NSImageView is visible; just not the image it also points to.
About 30% of the time only the red shows up; the other 70% of the time nothing shows up in the preview, just a smoky, blurred version of the Finder icons underneath.
My only recourse is to disable the extension list for the preview appex and let the thumbnailer appex provide the preview. There are no samples of this for macOS, and the default templates aren't a working version of it either.
For local files, it's an amazing feature to be able to space-bar open a PDF document and quickly swipe through the contents of it.
How can I enable this for pdf files stored on a volume exposed by FileProvider?
Thanks David
Somehow QuickLookUIService on my macOS Sierra 10.12 system has stopped working and can preview neither audio files nor video files ...
I have tried to Force Quit it in Activity Monitor, but QuickLookUIService doesn't appear to be in the system, and it is not in the Library's Quick Look folder either.
Any solution for this?
Thanks!
I've got an app with a quicklook generator bundled within it.
The app opens port 42222 for localhost queries.
The quicklook generator fails to connect to the socket.
The log shows these 2 sandbox errors:
Sandbox: 1 duplicate report for java deny(1) file-read-data /private/etc/hosts
Sandbox: ExternalQuickLoo(1253) deny(1) network-outbound *:42222
... which is weird because the app isn't sandboxed:
% codesign -d --entitlements :- /Applications/Test.app
Executable=/Applications/Test.app/Contents/MacOS/Test
The same code functions correctly when executed from a separate app running on the same machine (rather than from the generator).
Any idea why the quicklook generator isn't able to connect to a localhost socket?
... or why sandbox rules are being applied to a non-sandbox app?
Hi Community,
I am using Google's model-viewer (https://modelviewer.dev/) to display 3D models in the web browser. It also enables the user to open the 3D models in AR applications; on iOS, Quick Look is used.
It all works fine and well, except when the web space directory is protected by HTTP Basic Auth (protecting the usdz model files is a requirement of my client).
As I understand it, Quick Look loses (or never has access to) the HTTP session, and is therefore not authenticated when trying to fetch the usdz file from the protected web space directory.
The error message in Quick Look is: "Object requires a newer version of iOS".
Is there any way to open a usdz file that sits in a protected web space directory with Quick Look?
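For what it's worth, one workaround I can imagine (an assumption on my part, not documented Quick Look behavior) is to gate the usdz file behind a short-lived token carried in the URL itself rather than relying on the Basic Auth session, since Quick Look fetches the file outside the browser's session. A sketch of the client-side URL building, where `example.com` and the `token` parameter are placeholders:

```javascript
// Hypothetical workaround sketch: the server issues a short-lived token,
// and the page appends it to the model URL before handing the link to
// <model-viewer> / Quick Look. The server-side token validation is
// assumed and not shown here.
function tokenizedModelUrl(baseUrl, token) {
  const url = new URL(baseUrl);
  url.searchParams.set('token', token); // server validates and expires this
  return url.toString();
}

const src = tokenizedModelUrl('https://example.com/models/chair.usdz', 'abc123');
// src === 'https://example.com/models/chair.usdz?token=abc123'
```

The browsing pages could stay behind Basic Auth; only the model file itself would be served through the tokenized URL.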
Hi! I'm new to Apple development. Could anyone please tell me how to configure Quick Look for my document? Also, how do I print my document? Thanks🙏
Our app uses QLPreviewController for previewing files. We've recently discovered that previewing AR (.reality) files accesses the device's camera without the app first requesting permission to use it.
How do I launch Quick Look directly in Object mode (while still letting the user switch the slider to AR)?
func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    let url = Bundle.main.url(forResource: modelsName[selectedModelIndex], withExtension: "usdz")!
    return url as QLPreviewItem
}
The WWDC '17 session on Core Spotlight said QLSupportsSearchableItems is only for "shoebox" apps and explicitly said it's not for document-based apps. However, when I try to upload my UIDocumentBrowserViewController-based Mac Catalyst app with a Quick Look extension (the view-controller-based Quick Look preview works great) to App Store Connect, it prevents me from uploading without a QLSupportsSearchableItems key in my Quick Look extension's Info.plist, and it also rejects the upload if the value is false (0).
Is the app store validator correct, and I must have this key and it must be true?
Do I have to implement func preparePreviewOfSearchableItem(identifier:..., and if so, can I just have it always fail?
If it can't fail, how would I get a sandboxed URL for the document from a searchable-item identifier?
Also, Xcode finds nothing when I search for QLSupportsSearchableItems, which is not so much ironic as it is face-palmy.
How do I overlay an annotation/detail popup on AR models in Quick Look? This was done with the WWDC trading-cards AR model, and I haven't been able to find any other info on it. Here is an article with an image of what I am referring to:
https://vrscout.com/news/apple-shows-off-ar-trading-cards-ahead-of-wwdc-2022/
I want to add a small banner to my AR model when viewed in Quick Look in my app. I haven't been able to find much information on this; however, I found this resource, which explains how to add one when the model is hosted on a website. My model is stored locally on the device, so how would I add the banner?
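For comparison, the web-hosted case uses URL fragment parameters on the AR link (Apple documents `custom` for pointing at a banner HTML page and `customHeight` for its size); I'm not aware of a documented equivalent for locally stored models shown via QLPreviewController. A sketch of the web-side link, where the banner page URL is a placeholder:

```javascript
// Sketch of the documented web approach: AR Quick Look reads fragment
// parameters off the .usdz link. The banner page URL is a placeholder.
function arLinkWithBanner(usdzUrl, bannerPageUrl, height = 'small') {
  const params = new URLSearchParams({
    custom: bannerPageUrl, // HTML page rendered inside the banner
    customHeight: height,  // 'small' | 'medium' | 'large'
  });
  return `${usdzUrl}#${params.toString()}`;
}

const arHref = arLinkWithBanner('https://example.com/model.usdz',
                                'https://example.com/banner.html');
```

The resulting href would go on an `<a rel="ar">` element; for in-app, locally bundled models this path doesn't apply as far as I can tell.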
Hi,
I have searched all over the internet for a way to compress USDZ files for AR Quick Look. I see there is an old blog post about Draco compression for USD: https://opensource.googleblog.com/2019/11/google-and-pixar-add-draco-compression.html
Does anyone know whether Quick Look supports USDZ files with Draco compression? I haven't found any mention of it in the official documentation :/ Or maybe there is some other way?
I’m trying to use Quick Look AR to show a 3D Three.js model in AR on iOS. Exporting my 3D object to USDZ using the Three.js USDZ exporter works fine, and when I open the USDZ file in Xcode everything seems fine.
However, when opening the USDZ in Quick Look AR, the 3D model floats above the ground at my camera’s Y level. The camera’s point of view sits exactly in the middle of the model’s X and Z axes and at the bottom of its Y axis.
I have another problem with opening the USDZ in Quick Look AR: the model is invisible at first. When I scale the model down below 10%, it becomes visible, though it does not appear to change size at all.
Also, the “Model” tab in Quick Look does not even show the 3D model. When switching between the “Model” and “AR” tabs, the model flies by really quickly.
For reference, I’ve added my USDZ model below.
What I’m trying to accomplish is to position the 3D model in front of me, and for the 3D model to acknowledge the world shown by the camera. The 3D model should stick to walls, or at least the floor to begin with.
Button click code:
newScene.add(sceneRef)
const pivot = new THREE.Object3D()
newScene.add(pivot)
pivot.add(sceneRef)
// position the object on the pivot, so that it appears 5 meters
// in front of the user.
pivot.position.z = -50
const yaxis = new THREE.Vector3(0, 1, 0)
const zaxis = new THREE.Vector3(0, 0, 1)
const direction = zaxis.clone()
// Apply the camera's quaternion onto the unit vector of one of the axes
// of our desired rotation plane (the z axis of the xz plane, in this case).
direction.applyQuaternion(cameraRef.quaternion)
// Project the direction vector onto the y axis to get the y component
// of the direction.
const ycomponent = yaxis
.clone()
.multiplyScalar(direction.dot(yaxis))
// Subtract the y component from the direction vector so that we are
// left with the x and z components.
direction.sub(ycomponent)
// Normalize the direction into a unit vector again.
direction.normalize()
// Set the pivot's quaternion to the rotation required to get from the z axis
// to the xz component of the camera's direction.
pivot.quaternion.setFromUnitVectors(zaxis, direction)
// Finally, set the pivot's position as well, so that it follows the camera.
newScene.getWorldPosition(cameraRef.position)
newScene.updateMatrixWorld(true)
iosExporter.parse(newScene).then((result) => {
  saveUSDZString(result, 'scene.usdz')
})
saveUSDZString function:
function saveUSDZString(text: any, filename: any) {
  // 'model/vnd.usd+zip' is likely the more appropriate MIME type for usdz
  save(new Blob([text], { type: 'application/json' }), filename)
}
save function:
function save(blob: any, filename: any) {
  const link = document.createElement('a') // anchor element for the AR link
  link.href = URL.createObjectURL(blob)
  link.download = filename
  link.rel = 'ar'
  let img = document.createElement('img')
  img.alt = 'hi'
  img.src = 'https://google.com/img'
  link.appendChild(img)
  link.click()
}
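As an aside, the xz-plane projection in the button-click code (subtract the y component of the camera's view direction, then renormalize, so the pivot only yaws) can be sanity-checked outside THREE; a minimal sketch with plain arrays:

```javascript
// Minimal re-implementation of the projection used above: drop the y
// component of a direction vector and renormalize what remains.
function projectToXZ([x, _y, z]) {
  const len = Math.hypot(x, z); // length after dropping y
  return [x / len, 0, z / len];
}

// A camera pitched downward but facing -z still yields a pure -z heading:
const heading = projectToXZ([0, -0.5, -1]);
// heading → [0, 0, -1]
```

If the model still spawns at the wrong height, the projection itself is probably not the culprit, which points back at the pivot/camera positioning at export time.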
USDZ Model: https://wetransfer.com/downloads/2d2d2e840f9f964e036cd6077094c33220220630095321/a7f94b9f2bc730fead9107bf133e175220220630095338/193b81?utm_campaign=WT_email_tracking&utm_content=general&utm_medium=download_button&utm_source=notify_recipient_email