I am wondering what the best practice is for attributing app installs to points of purchase.
For example, I will have a map app, and a visitor center may display a poster with a barcode to install the app. I need to be able to attribute every install (purchase) made via that barcode to the visitor center.
Can an App Store URL include a unique identifier or referrer that will be associated with the purchase and visible somewhere like App Store Analytics?
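If I understand correctly, App Analytics campaign links might cover this. A sketch of what I have in mind (the app ID and both tokens here are made up; my understanding is that pt is the provider token from App Store Connect and ct is a campaign token I could assign per location):

https://apps.apple.com/app/id1234567890?pt=123456&ct=visitor-center-1&mt=8

Each visitor center would then get its own ct value encoded in its poster's barcode.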
I want to create an MKRoute from a list of MKMapPoints or coordinates, but apparently an MKRoute can only be generated by an MKDirections request to Apple's servers.
The primary use of my app will be activities (e.g., hiking) in the backcountry, where (1) a network connection likely won't be available and (2) there likely will not be a trail in Apple's map network.
For example, I want to provide navigation for following a recorded GPS track or my own MKPolylines.
Note that I am required to use MapKit (3rd party map SDKs are not an option for a number of reasons). It feels like a huge missed opportunity if MapKit doesn't allow Routes to be created from a predetermined list of coordinates.
Does anyone know of a solution to this problem, either somehow creating an MKRoute from a list of coordinates or a 3rd-party library? I've searched but haven't had any luck finding one. It seems like something like this must exist, so I thought I'd ask.
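In the meantime, the closest I can get is rendering the track myself (a minimal sketch; loadRecordedTrack and mapView are placeholders for my own code):

import MapKit

// Render the recorded GPS track as a plain overlay.
// `loadRecordedTrack()` is a hypothetical loader for my recorded coordinates.
let coordinates: [CLLocationCoordinate2D] = loadRecordedTrack()
let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
mapView.addOverlay(polyline)

// There is no public MKRoute(coordinates:) initializer, so MKRoute-based
// features (such as step-by-step UI) can't be driven from this polyline.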
For the "Required device capabilities" in my info.plist I have:
iPhone Performance Gaming Tier
iPad Minimum Performance M1
But a beta tester just informed me they cannot install it on an iPhone 16 Pro Max due to "incompatible hardware".
I need to limit installs to iPhone 15 or newer and M1 iPads or newer. I read here that iPhone Performance Gaming Tier also limits iPads to M1:
https://developer.apple.com/forums/thread/737946
Perhaps I should only use "iPhone Performance Gaming Tier", and including "iPad Minimum Performance M1" as well is what prevents installation on any iPhone?
It would be very nice if I could see what devices are supported by the current settings.
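For reference, here is how I believe those capabilities map to raw values in Info.plist (this is my understanding of the raw UIRequiredDeviceCapabilities strings; worth double-checking):

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>iphone-performance-gaming-tier</string>
    <string>ipad-minimum-performance-m1</string>
</array>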
Topic:
App & System Services
SubTopic:
Hardware
I don't know why, but for my Mac Catalyst target, I have to set my view controller's Y origin to 36 and subtract 36 points from the view height, or the view is clipped.
The following works in my main UIViewController, but feels super hacky. I'd feel better if I understood the issue and addressed it properly.
override func viewWillLayoutSubviews() {
    super.viewWillLayoutSubviews()
    #if targetEnvironment(macCatalyst)
    // Work around the clipping: offset the view by 36 points, which
    // appears to match the window's title bar height.
    if view.frame.origin.y < 1 {
        let f = UIApplication.shared.sceneBounds // sceneBounds: helper extension in my app
        let newFrame = CGRect(x: 0.0, y: 36, width: f.size.width, height: f.size.height - 36)
        self.view.frame = newFrame
    }
    #endif
}
My guess is it starts the view under the title bar, but I have these set in the view controller:
self.extendedLayoutIncludesOpaqueBars = false
self.edgesForExtendedLayout = []
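For comparison, this is the constraint-based layout I would have expected to work without any frame math (a sketch assuming the clipping comes from the title bar overlapping the top of the view; contentView is a placeholder for my top-level content):

// Pin content to the safe area instead of adjusting the root view's frame.
contentView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    contentView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
    contentView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    contentView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
    contentView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
])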
If I change MKMapView's .preferredConfiguration elevation style from .realistic to .flat, or its .mapType from .hybridFlyover to .hybrid, subsequent scrolling causes a crash:
-[MTLDebugRenderCommandEncoder validateDrawIndexedPrimitives:indexCount:indexType:indexBuffer:indexBufferOffset:instanceCount:function:]:6179: failed assertion `Draw Indexed Primitives Validation
indexBufferOffset(0) + (indexCount(864) * 2) must be <= [indexBuffer length] (12).
For example, changing:
mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .realistic)
to:
mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .flat)
Then, scroll the map view, and it will crash. It is OK the other way around.
Or change:
self.mapView.mapType = .hybridFlyover
to:
self.mapView.mapType = .hybrid
I've tried everything I can think of, including calling functions like these after the change:
mapView.setNeedsDisplay()
mapView.setRegion(self.mapView.region, animated: false)
.mapType and .preferredConfiguration are settable properties, so it should be possible to change them. I could create a new MKMapView, but I'd have to perfectly recreate its state, which is not trivial and far from ideal.
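For clarity, here is the minimal shape of the change that triggers it (a sketch; mapView is assumed to be a fully configured MKMapView already on screen):

// Toggling the elevation style; switching .realistic to .flat and then
// scrolling crashes, while the reverse direction works fine.
@objc func toggleElevationStyle() {
    if let config = mapView.preferredConfiguration as? MKHybridMapConfiguration,
       config.elevationStyle == .realistic {
        mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .flat)
    } else {
        mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .realistic)
    }
}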
I'm just trying to work around issue FB14553276 so my map tiles don't show tiling seams in 2D, which is a new issue introduced with iOS 18. This potential workaround still shows the seams in 3D, but that is better than always showing seams. It seems like whatever I do, I just can't defeat MapKit bugs, and that puts me in an impossible situation. :(
I've submitted feedback for this issue: FB16153802
It seems like others are experiencing the issue:
https://forums.developer.apple.com/forums/thread/730780
Loading tile overlays is slow even when the raster data is locally available on the device running iOS 18.2 and built with Xcode 16.2.
In this video (https://3dtopo.com/superSlowTileLoading.mov) it takes 38 seconds to load tiles readily available on the device. Then, the whole screen flashes when tiles that are already drawn are redrawn, making for a very poor user experience. 38 seconds to load a dozen or so small images (512x512) stored locally on the device is simply unacceptable. I can't release a product like this that I've spent the last 1.5 years building and many years developing the maps themselves. This severe issue is new since I committed to basing my app on MapKit.
Note that this issue does not occur with Apple's base map tiles.
I created a Feedback Assistant case, FB16110803, for this issue.
For the video, I disabled loading any tiles from the network and disabled loading any other data, such as polylines. Essentially all I am doing is loading the tiles stored on the device and returning them, such as:
public func loadTile(at path: MKTileOverlayPath, result: @escaping (Data?, Error?) -> Void) {
    // `key` is derived from `path` elsewhere in the class (omitted here).
    fetchData(forKey: key,
              failure: { error in result(nil, error) },
              success: { data in result(data, nil) })
}

open func fetchData(forKey key: String, failure fail: ((Error?) -> ())? = nil, success succeed: @escaping (Data) -> ()) {
    let path = self.path(forKey: key)
    do {
        let data = try Data(
            contentsOf: URL(fileURLWithPath: path),
            options: Data.ReadingOptions())
        succeed(data)
        self.updateDiskAccessDate(atPath: path)
    } catch {
        if let block = fail {
            block(error)
        }
    }
}
I cannot find the hardware requirements for Image Playground documented anywhere. I'm also not sure if they are identical to those of devices that support Apple Intelligence.
On the App Store, the only requirement listed for Image Playground is iOS 18.2.
Not knowing the requirements is an issue because I need to be able to clearly state the requirements for the feature in my app description.
Also, I'm sure my mother's current iPad is too old, but I'm not sure what models support it if I were to buy her a new one.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I'm trying to determine the best practice for handling if Image Playground is available but not installed or simply not supported.
If ImagePlaygroundViewController.isAvailable is true, I will just display a button to start an Image Playground session. If it is false, does that mean Image Playground is supported but not installed?
If it's supported and not installed, instead of a button to launch it, I want to display something like "Enable Apple Intelligence in Settings" or, better yet, a button that opens the Intelligence settings. Is that possible?
But if it is on a system that doesn't support it, of course, I don't want to instruct the user to enable it. How can I determine if a device cannot install Image Playground?
I read that Apple Intelligence requires an iPhone 15 Pro, iPhone 15 Pro Max, or any iPhone 16 model, with no mention of the M1 iPad Pro, yet Image Playground runs on my M1 iPad Pro. What are the hardware requirements for Image Playground?
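For context, here's the shape of what I'm trying to write (a sketch; showPlaygroundButton is a hypothetical UI helper, and the fallback only opens my app's own settings, since I don't know of a public deep link to the Apple Intelligence pane):

import ImagePlayground
import UIKit

func updateImagePlaygroundUI() {
    if ImagePlaygroundViewController.isAvailable {
        // Show the button that presents an ImagePlaygroundViewController.
        showPlaygroundButton() // hypothetical UI helper
    } else {
        // Ambiguous case: unsupported hardware, or supported but not enabled?
        // As an assumed fallback, open this app's own Settings page:
        if let url = URL(string: UIApplication.openSettingsURLString) {
            UIApplication.shared.open(url)
        }
    }
}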
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
To my horror, after upgrading to macOS 15 beta (24A5264n), the stable version of Xcode (15.4) refuses to run.
I need to be able to compare behavior and produce builds for the App Store.
Is there a stable version of Xcode planned for macOS 15 beta before Xcode 16 is released?
Topic:
Developer Tools & Services
SubTopic:
Xcode
I understand that MapKit automatically sizes the font based on the system Dynamic Type size.
The thing is, the default font size is plenty legible for me system-wide except for some type in MapKit, while other type in MapKit is already plenty big.
For instance, the stream names are much harder to read than they should be by default. And if I increase the system Dynamic Type size, then it makes some MapKit text much larger than needed, while the stream text can still be hard to read.
So is there any way to override or adjust font sizes in MapKit? I'd like to be able to apply percentages to the Dynamic Type suggestions. For streams, for example, I'd like to scale the text somewhere between 133% and 150%.
The smallest Dynamic Type size is .caption2 at 11 points with default settings. With those defaults, it looks like MapKit is drawing stream text at around 7 points.
This post states that attribution is required when using MapKit, and that omitting it violates the developer agreement.
Can I provide my own attribution if I am displaying my original map content? It seems bizarre to attribute map sources that are not used.
I need multiple MKTileOverlays with multiple blend modes. Apparently, using an overlay with a different blend mode causes the layer under it to use the same blend mode.
For the example below, using a normal blend mode on top of a soft light blend mode causes a normal blend mode to be used instead of soft light. The soft light layer is rendered as expected until the normal layer is displayed starting at zoom level 15.
First, I've subclassed MKTileOverlay to add an overlay type so that the correct renderer is provided per overlay. (I know there is a title property I could check instead, but I prefer this.)
enum OverlayType {
    case softLight, normal
}

class TileOverlay: MKTileOverlay {
    var type: OverlayType = .normal
}
Then setup layers and renderers in the typical fashion:
var softLightRenderer: MKTileOverlayRenderer!
var normalRenderer: MKTileOverlayRenderer!

private func setupSoftlightRenderer() {
    let overlay = TileOverlay(urlTemplate: "http://localhost/softlight/{z}/{x}/{y}")
    overlay.type = .softLight
    overlay.canReplaceMapContent = false
    overlay.minimumZ = 9
    overlay.maximumZ = 20
    mapView.addOverlay(overlay, level: .aboveLabels)
    softLightRenderer = MKTileOverlayRenderer(tileOverlay: overlay)
    softLightRenderer.blendMode = .softLight
}

private func setupNormalRenderer() {
    let overlay = TileOverlay(urlTemplate: "http://localhost/normal/{z}/{x}/{y}")
    overlay.type = .normal
    overlay.canReplaceMapContent = false
    overlay.minimumZ = 15
    overlay.maximumZ = 20
    mapView.addOverlay(overlay, level: .aboveLabels)
    normalRenderer = MKTileOverlayRenderer(tileOverlay: overlay)
    normalRenderer.blendMode = .normal
}
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let overlay = overlay as? TileOverlay {
        switch overlay.type {
        case .softLight:
            return softLightRenderer
        case .normal:
            return normalRenderer
        }
    }
    print("Warning: using unhandled overlay renderer")
    return blankRenderer // fallback renderer defined elsewhere
}
override func viewDidLoad() {
    ...
    setupSoftlightRenderer()
    setupNormalRenderer()
}
Interestingly, if I put the overlays at two different levels, one .aboveLabels and the other .aboveRoads, it works as expected (see the sketch after my question below). The problem is that this limits me to two different overlay blend modes, and I could really use more than two. I tried every possible variation of inserting the layers at different indexes and with different methods, but the only combination that seems to work is .aboveLabels plus .aboveRoads.
How can I use more than two different blend modes?
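For reference, the only arrangement I've found where the two blend modes render independently is this (a sketch reusing the overlays from above; the variable names are illustrative):

// Each insertion level appears to get its own compositing pass.
mapView.addOverlay(softLightOverlay, level: .aboveRoads)
mapView.addOverlay(normalOverlay, level: .aboveLabels)

But since those are the only two levels that behave this way, I'm still capped at two blend modes.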
Hello,
I have my largely-iOS app running under Mac Catalyst, but I need to limit which Macs can install it from the Mac App Store based on GPU family, like MTLGPUFamily.mac2. Is that possible?
Or I could limit it to Apple Silicon using the Designed for iPad target, but I would prefer to use Mac Catalyst instead of Designed for iPad. Is it possible to limit Mac Catalyst installs to Apple Silicon Macs?
Side question: what capabilities are supported by MTLGPUFamily.mac2? I can't find this documented. My main interest is CoreML inference acceleration.
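For what it's worth, the family can at least be queried at runtime (a minimal sketch; note this doesn't gate App Store installs, which is what I actually need):

import Metal

// Runtime capability check only; it does not restrict who can install the app.
if let device = MTLCreateSystemDefaultDevice() {
    print("MTLGPUFamily.mac2 supported: \(device.supportsFamily(.mac2))")
}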
Thank you.
When I run the performance test on a CoreML model, it shows predictions are 834% faster running on the Neural Engine than on the GPU.
It also shows that 100% of the model can run on the Neural Engine:
[screenshot: performance report, all compute units]
GPU only:
[screenshot: performance report, GPU only]
But when I set the compute units to all:
let config = MLModelConfiguration()
config.computeUnits = .all
and profile, it shows that the Neural Engine isn't used at all. Well, other than loading the model, which takes 25 seconds when the Neural Engine is allowed versus less than a second when it is not:
[screenshot: profile showing no Neural Engine activity]
The difference in speed is the difference between the app being too slow to even release versus quite reasonable performance. I have a lot of work invested in this, so I am really hoping that I can get it to run on the Neural Engine.
Why isn't it actually running on the Neural Engine when it shows that it is supported and I have the compute unit set to run on the Neural Engine?
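For reference, this is how I'm loading the model (a sketch; MyModel stands in for the Xcode-generated model class):

import CoreML

let config = MLModelConfiguration()
config.computeUnits = .all // should permit the Neural Engine

// `MyModel` is a placeholder for the generated model class.
let model = try MyModel(configuration: config)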
Topic:
Machine Learning & AI
SubTopic:
General
Tags:
ML Compute
Machine Learning
Performance
Core ML
My app is capable of writing multi-gigabyte images. I need a way to allow the user to do something with the full-resolution images (such as AirDrop them). It does not appear possible to write the image to Photos, since there may not be enough memory to read the whole image into RAM, and as far as I can tell, PhotoKit allows read-only access.
With UIActivityViewController I am able to copy the large images with an NSFileCoordinator, but only if I use an extension of .zip or .bin for the destination URL. The file is copied, but macOS thinks the files are archives rather than images. After the file has been copied, changing the extension to .jpg on the Mac allows the copied image to be used, but that is less than an ideal user experience.
Zipping the image (which is already compressed) is not a viable solution, since the whole image would need to be read to zip it, which may exceed the available memory. Not to mention it is a total waste of time and power.
How can I allow a user to copy multi-gigabyte images from my app, and ideally with UIActivityViewController?
Here is the code that allows the image to be copied. If I don't use .zip or .bin for the destination URL, the image isn't copied; nothing happens, and no errors are reported.
func copyImage() {
    let fm = FileManager.default
    var archiveUrl: URL?
    var error: NSError?
    let imageProductURL = TDTDeviceUtilites.getDocumentUrl(
        forFileName: "hugeImage.jpg")
    let coordinator = NSFileCoordinator()
    coordinator.coordinate(readingItemAt: imageProductURL,
                           options: [.forUploading],
                           error: &error) { (zipUrl) in
        let tmpUrl = try! fm.url(
            for: .itemReplacementDirectory,
            in: .userDomainMask,
            appropriateFor: zipUrl,
            create: true
        ).appendingPathComponent("hugeImage.jpg.zip")
        try! fm.moveItem(at: zipUrl, to: tmpUrl)
        archiveUrl = tmpUrl
    }
    if let url = archiveUrl {
        let avc = UIActivityViewController(activityItems: [url],
                                           applicationActivities: nil)
        present(avc, animated: true)
    } else {
        print(error)
    }
}