Following up on your later messages, as I was working on my response through most of today. For my ~100,000-file Documents folder, it took about 6 seconds to run. Here's how long my enumeratorAtURL code took: enumeratorAtURL (empty) -> File Count: 100000, Time: 1.344492s. And this one - although lengthy - took about 0.45 sec to do the same job (even a little better). Yes. What you're doing here is replacing a single serial run through the directory hierarchy with a series of parallel iterations inside the secondary hierarchies. Particularly on APFS, this can speed things up, as there isn't very much I/O contention (the catalog data is all in memory) or lock contention (APFS is better at this than many file systems). FYI, contentsOfDirectoryAtURL is a wrapper around enumeratorAtURL, iterating the directory hierarchy and accumulating the results into a target array. In your particular case, you inadvertently ended up jumping from the slowest enumerator (enumeratorAtPath) to the fastest (enumeratorAtURL + a proper
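The single-enumerator core of this approach can be sketched roughly as follows. This is a hedged sketch, not the poster's actual code: it counts regular files under one root using enumeratorAtURL with prefetched resource keys; the parallel variant discussed above would dispatch one such enumeration per top-level subdirectory.

```swift
import Foundation

// Counts regular files under `root` using enumerator(at:includingPropertiesForKeys:),
// prefetching the isRegularFile key so per-item checks hit cached metadata
// instead of issuing a fresh stat for every URL.
func countFiles(under root: URL) -> Int {
    let keys: [URLResourceKey] = [.isRegularFileKey]
    guard let enumerator = FileManager.default.enumerator(
        at: root,
        includingPropertiesForKeys: keys,
        options: []
    ) else { return 0 }

    var count = 0
    for case let url as URL in enumerator {
        let values = try? url.resourceValues(forKeys: Set(keys))
        if values?.isRegularFile == true {
            count += 1
        }
    }
    return count
}
```

To parallelize, you would enumerate the root's immediate children first and run one of these per subdirectory on a concurrent queue, summing the results.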
missing package product
Hello I've noticed that this product, heavily promoted on the ASC forums for many years, is no longer available from the Apple App Store. Can anyone tell me the reason why the product is no longer supported? Friends have asked me if it is 'safe' to use. Is it? Note to moderator: If I'm asking in the wrong places, please redirect my question. Thank you.
These are the Developer Forums, where developers of apps for Apple's platforms ask each other for hints and tips on coding. Your question is more of a product support one, so I'd suggest you ask it over at the Apple Support Forums. Thanks.
I have had a suspicious crash in my app for a long time. I'm using SwiftUI about 95% of the time, with only one UIKit view that I embed into SwiftUI. I don't understand why my app crashes when the CALayer position is set to NaN, because I don't reference layers at all, including in that UIKit view. Any ideas why it may happen and how I can reproduce it? Fatal Exception: CALayerInvalidGeometry CALayer position contains NaN: [nan nan]. Layer: <CAShapeLayer:0x303cd9600; position = CGPoint (0 0); bounds = CGRect (0 0; 0 0); delegate = _NoAnimationDelegate; allowsGroupOpacity = YES; anchorPoint = CGPoint (0 0); > Fatal Exception: CALayerInvalidGeometry 0 CoreFoundation 0x83f20 __exceptionPreprocess 1 libobjc.A.dylib 0x16018 objc_exception_throw 2 CoreFoundation 0x1826dc -[NSException initWithCoder:] 3 QuartzCore 0x7b28 CA::Layer::set_position(CA::Vec2<double> const&, bool) 4 QuartzCore 0x7a58 -[CALayer setPosition:] 5 SwiftUI 0x1a131d4 objectdestroy.10Tm 6 SwiftUI 0x1a12ab4 objectdestroy.10Tm 7 Swift
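One common mitigation while hunting such a crash is to sanitize any geometry before it can reach a layer. This is a minimal sketch with a hypothetical helper name (`sanitized(_:)` is not an Apple API); the idea is to replace non-finite components so the hard CALayerInvalidGeometry exception becomes a recoverable fallback you can log:

```swift
import Foundation

// Returns `point` unchanged if both components are finite; otherwise
// returns `fallback`. Dropping this in front of any assignment that feeds
// CALayer.position prevents NaN from ever reaching Core Animation.
func sanitized(_ point: CGPoint, fallback: CGPoint = .zero) -> CGPoint {
    guard point.x.isFinite, point.y.isFinite else { return fallback }
    return point
}
```

Usage would look like `layer.position = sanitized(computedPosition)`; pairing it with a log statement in the non-finite branch can help locate which computation produces the NaN.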
Hello, I am facing this issue too - to be precise, when I use VNCoreMLModel. However, I have seen this happen only when the app is run from Xcode. If you run it in production, nothing strange happens. Could it be some sort of debugging bug?
I'm encountering issues when running my os_signpost instrument; it spits out the following kind of message: Data stream: 5 log/signpost messages lost due to high rates in live mode recording. Try windowed recording mode. I am using deferred recording already (enabled from the checkbox in the recording preferences), but I can't find a setting for a windowed recording mode anywhere. Would appreciate a pointer in the right direction!
I'm working on a multi-platform app (macOS and visionOS for now). In these early stages it’s easier to target the Mac, but I started with a visionOS project. One of the things the template creates is a RealityKitContent package dependency. I can target macOS 14.5 in Xcode, but when it goes to build the RealityKitContent, I get this error: error: Building for 'macosx', but '14.0' must be >= '15.0' [macosx] info: realitytool [/Applications/Xcode-beta.app/Contents/Developer/usr/bin/realitytool compile --platform macosx --deployment-target 14.0 … Unfortunately, I'm unwilling to update this machine to macOS 15, as it's too risky. Running macOS 15 in a VM is not possible (Apple Silicon). This strikes me as a bug, or a severe shortcoming, of realitytool. It was introduced with visionOS 1.0, and should be able to target macOS < 15. It's not really reasonable to stay on Xcode 15, since soon enough Apple will require I build with Xcode 16 for submission to the App Store. Is this a bug, or intentional?
I'm trying CoreSpotlight on the 18b1 seed on iOS and after submitting my query, I'm getting multiple errors about what looks like missing models: [Model loading] model loading failed with err -1000 for model path /Users/hunter/Library/Developer/CoreSimulator/Devices/0AF4F46E-5510-4458-B61C-F8A153155809/data/Containers/Data/Application/1D8580C0-AC80-4949-9FDA-31DB463BDA5C/Library/Spotlight/Resources_V3/Default/models/spotlight_l2.mlmodelc and directives path /Users/hunter/Library/Developer/CoreSimulator/Devices/0AF4F46E-5510-4458-B61C-F8A153155809/data/Containers/Data/Application/1D8580C0-AC80-4949-9FDA-31DB463BDA5C/Library/Spotlight/Resources_V3/Default/directives/directives_l2.mdplist I am calling CSUserQuery.prepare() but that doesn't seem to make a difference. Is there more to this than what is on this page? https://developer.apple.com/documentation/corespotlight/building-a-search-interface-for-your-app?changes=latest_minor
There are many reasons for a problem like this to appear. The two most common reasons we see during development are:
- Building over and over from Xcode: your tokens may have changed and you are not aware of it. Try cleaning up: delete the app from the device, install a fresh build, and test using the new tokens.
- If you are switching between debug and release builds, make sure that you are using the correct tokens and the correct APNs endpoints. Development and production tokens are different and can only be used with the corresponding environments.
If these simple issues are not it, then you will need to look at any errors you are receiving, and if you cannot solve the issue, supply some detailed information about the failing notifications. You can read more about the diagnostic info needed at If you need assistance debugging your push notification issues
Argun Tekant / DTS Engineer / Core Technologies
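A quick way to notice that a token has rotated between installs is to log its hex form on every launch. This is a hedged sketch (the helper name is my own, not an Apple API); you would call it from application(_:didRegisterForRemoteNotificationsWithDeviceToken:):

```swift
import Foundation

// Converts the raw APNs device-token Data into the lowercase hex string
// that push providers expect, so you can compare tokens across builds.
func hexToken(_ deviceToken: Data) -> String {
    deviceToken.map { String(format: "%02x", $0) }.joined()
}
```

Logging `hexToken(deviceToken)` in both your debug and release builds makes it obvious when you are sending to a stale or wrong-environment token.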
I have an iPhone 14 running iOS 16.1 and my Series 5 watch running watchOS 9.1. I was able to turn on Developer Mode on the phone by going to Settings --> Privacy & Security --> Developer Mode. On the watch, however (I'm doing this directly on the watch and not in the Watch app on the phone), once I'm in Privacy & Security, there is no option to select Developer Mode. How do I get my watch into Developer Mode in order to get a successful build in Xcode?
This is a bit off-topic, but hoping one of you might reply. I just learned about the Network framework. In an introductory WWDC talk, they show a live-streaming video example, but the code isn’t available (sadly). There’s a reference to “breaking the frame up into blocks” because of delivery over UDP: I assume this is because of message lengths? At any rate, if someone can give me a quick idea of the strategy for sending video frames from device to device over UDP (which must assume some things can get lost), I’d greatly appreciate it. I assume UDP has message length constraints in Network, but I don’t see them mentioned. Surely I can’t just send an entire 2K JPEG image (i.e. 1920x1080 pixels) in one UDP message. Or can I?
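The "breaking the frame up into blocks" idea usually means splitting each encoded frame into datagram-sized pieces tagged with enough metadata to reassemble (or drop) the frame when pieces go missing. A minimal sketch, assuming a safe per-datagram payload of around 1200 bytes (a common conservative figure under typical path MTUs; the header layout here is my own assumption, not a Network-framework API):

```swift
import Foundation

// One datagram's worth of an encoded video frame. The receiver buffers
// blocks by (frame, index) and discards the frame if any block is lost.
struct Block {
    let frame: UInt16      // which video frame this block belongs to
    let index: UInt16      // position of this block within the frame
    let count: UInt16      // total blocks in the frame
    let payload: Data
}

// Splits an encoded frame into sequence-numbered blocks no larger than
// `maxPayload` bytes each.
func makeBlocks(frame: UInt16, data: Data, maxPayload: Int = 1200) -> [Block] {
    let count = max(1, (data.count + maxPayload - 1) / maxPayload)
    return (0..<count).map { i in
        let start = i * maxPayload
        let end = min(start + maxPayload, data.count)
        return Block(frame: frame,
                     index: UInt16(i),
                     count: UInt16(count),
                     payload: data.subdata(in: start..<end))
    }
}
```

Each block would then be serialized (header plus payload) and sent as one NWConnection datagram; for live video you typically drop incomplete frames rather than retransmit.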
I have built my app with no errors and uploaded it to TestFlight successfully. But when I try to install the app on my iPhone, it throws the following error: Could not install {AppName}: the requested app is not available or doesn't exist. I also received an email saying ITMS-90078: Missing Push Notification Entitlement - Your app appears to register with the Apple Push Notification service, but the app signature's entitlements do not include the 'aps-environment' entitlement. ... But I already have the required provisioning profile and have added the 'aps-environment' entitlement manually. What is the reason, and what are the potential solutions?
I am unable to receive push notifications through my Apple Wallet pass. Based on the documents I have read, I have set up my APN provider and should be sending all the necessary information. Furthermore, my logs show that I am successfully sending my notification to APN, so it is confusing that no notification appears on my phone. Below are my code snippets: apnProvider.js: const path = require('path'); const options = { cert: path.resolve(__dirname, '../certificates/signerCert.pem'), key: path.resolve(__dirname, '../certificates/signerKey.pem'), passphrase: 'test', production: true }; const apnProvider = new apn.Provider(options); module.exports = apnProvider; APN provider using an auth key (currently not in use, but it was previously and produced the same success message): token: { key: path.resolve(__dirname, '../certificates/AuthKey_627HR2YX2S.p8'), keyId: '627HR2YX2S', teamId: '72J45J9PH3' }, production: true }); API route: const { userId } = req.body; console.log(`Received POST request on /
My tests fail with Restarting after unexpected exit, crash, or test timeout. This is what I have established so far:
- It only happens when building with xcodebuild (14.3), not when building with Xcode (14.3).
- The tests that are failing are in a .testTarget within a local Swift package and part of a test plan together with tests that are in subprojects.
- There are no crash logs, so these tests are likely not crashing.
- With Xcode 14.2 there are no problems whatsoever (xcodebuild or Xcode).
Any ideas on how to debug this further? Best, Roddi
This no longer seems to work. When I create a new standalone app, I only get a Watch App target, and when trying to archive it, I only get options for exporting the archive to file. The bundle ID ends with .watchkitapp, the app records are online at App Store Connect, and there is even a screenshot in the watchOS portion of the app record. Still: what am I missing?