The documentation for NSCollectionViewItem refers to a Collection View Programming Guide for macOS: "Legacy Item Support — For apps built before OS X 10.11, you created a template item and assigned it to the itemPrototype property of your collection view. To create new instances of the item, you called the collection view's newItemForRepresentedObject: method. For more information about how to support older collection view configurations, see Collection View Programming Guide for macOS." Where can I find this Collection View Programming Guide for macOS?
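For context, the legacy flow the quoted passage describes looks roughly like this; a sketch only, and the `MyItem` nib name is an assumption for illustration:

```swift
import AppKit

let collectionView = NSCollectionView(frame: .zero)

// Before OS X 10.11 you assigned one template item to the collection view...
collectionView.itemPrototype = NSCollectionViewItem(nibName: "MyItem", bundle: nil)

// ...and asked the collection view to clone it for each model object.
// (This method is deprecated in modern SDKs.)
let item = collectionView.newItem(forRepresentedObject: "some model object")
```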
Search results for
Apple Maps Guides
149,818 results found
I'm wondering if there is a dummy's guide to code signing. This page seems to have a step-by-step process but doesn't tell you how to actually do any of the steps: https://medium.com/ios-os-x-development/ios-code-signing-provisioning-in-a-nutshell-d5b247760bef

For example: "Xcode will be installed and the Intermediate Certificate will be pushed into the Keychain." How? Where do I install Xcode from (I actually figured that out)? What do I need to do to get the intermediate cert pushed into the keychain? What is a keychain? Where do I find it?

"Certificate Signing Request (CSR) will be created." How is the CSR created? Do I need to do something, or is this automagic once I install Xcode or do something else?

It surprises me how bad the documentation for this is. Since I just spent $100 to be in the developer program, I would think Apple would at the very least have this. If they do, it's well hidden.
Topic:
Code Signing
SubTopic:
Certificates, Identifiers & Profiles
Tags:
Signing Certificates
Provisioning Profiles
I made a map positioning app. The design is: when the user logs in, his position is shown on the map automatically, with his location at the center of the map (that is, the blue dot is in the center). But when I log in on my iPhone 5s, the blue dot is never in the center of the map; there's an offset on my phone. On an iPhone 6 the behavior is normal: when that user logs in, the blue dot is in the center of the map. Why is there an offset on my phone?
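One way to keep the blue dot centered is to recenter the map on every user-location update from the delegate, rather than only once at login. A minimal sketch, assuming a view controller with an MKMapView outlet:

```swift
import MapKit
import UIKit

class MapViewController: UIViewController, MKMapViewDelegate {
    @IBOutlet weak var mapView: MKMapView!

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.delegate = self
        mapView.showsUserLocation = true
    }

    // Called each time MapKit updates the user's location.
    func mapView(_ mapView: MKMapView, didUpdate userLocation: MKUserLocation) {
        // Keep the current zoom level; only move the center to the user.
        mapView.setCenter(userLocation.coordinate, animated: true)
    }
}
```

If the offset only appears on one device, it is also worth checking for device-specific layout code (e.g. hard-coded frame sizes) that shifts the map view itself.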
When you refer to an NFC door, what do you mean by that? You'd need an Apple Access certified NFC reader at the door. Here is a guide on the topic: https://www.getkisi.com/guides/apple-wallet-access
Topic:
App & System Services
SubTopic:
Hardware
Tags:
If the app uses the significant location change service, it will be relaunched on significant location changes even after force quit (iOS 8 and later), and I believe that if it has Always permission it can restart continuous monitoring at that point using the standard APIs. I haven't done that myself to confirm. Everything you are allowed to do is described in the Location & Maps Programming Guide.
Topic:
App & System Services
SubTopic:
Core OS
Tags:
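The pattern described above can be sketched as follows; this is an assumption-laden outline (class and method names are illustrative), not confirmed behavior:

```swift
import CoreLocation

// Sketch: restart significant-change monitoring, then escalate to
// continuous updates once a location arrives. Assumes Always
// authorization has already been granted.
class LocationRestarter: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // This service can relaunch the app after termination.
        manager.startMonitoringSignificantLocationChanges()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // After a relaunch (check UIApplication.LaunchOptionsKey.location
        // in the app delegate), switch back to continuous monitoring.
        manager.startUpdatingLocation()
    }
}
```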
I’m going through Develop in Swift Fundamentals and I’m stuck at a problem that says: “Learn about the map method, and use it in place of the loop that converts the array of characters to an array of strings in updateUI().” Here’s the code:

var letters = [String]()
for letter in currentGame.formattedWord {
    letters.append(String(letter))
}

Thanks in advance.
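The loop can be collapsed into a single map call, which transforms each Character of the word into a String. A standalone sketch with a stand-in word (in the lesson you would use currentGame.formattedWord directly):

```swift
// Stand-in for currentGame.formattedWord from the exercise.
let formattedWord = "APPLE"

// map visits each Character and returns the transformed results as an array.
let letters = formattedWord.map { String($0) }
// letters == ["A", "P", "P", "L", "E"]
```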
Hello, is there any way to select which input channels shall be mapped to the output with an AVAudioMixerNode? I have an inputNode with 4 channels and want to use 2 specific channels from it within my processing chain. To give an idea, it looks like this:

...
var mixer = AVAudioMixerNode()
var inputFormat = audioEngine.inputNode.inputFormat(forBus: 0) // This one has 4 channels
var outputFormat = AVAudioFormat(commonFormat: inputFormat.commonFormat,
                                 sampleRate: inputFormat.sampleRate,
                                 channels: 2,
                                 interleaved: inputFormat.isInterleaved)!
audioEngine.attach(mixer)
audioEngine.attach(someOtherNode)
audioEngine.connect(audioEngine.inputNode, to: mixer, format: inputFormat)
audioEngine.connect(mixer, to: someOtherNode, format: outputFormat)
...

Basically this works as expected, but I have no choice about the channel mapping from input to output. The mixer automatically maps channels 0 and 1 of the four input channels to the two output channels. How can I tell the mixer that it should map channels 2 and 3 to the output channels?
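AVAudioMixerNode itself exposes no channel-map API. One workaround sometimes suggested is to set kAudioOutputUnitProperty_ChannelMap on the input node's underlying audio unit before making connections. This is an assumption-heavy sketch: the scope/element values shown are the usual ones for input hardware (element 1), but they are not guaranteed for every configuration.

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()

if let audioUnit = engine.inputNode.audioUnit {
    // Pick hardware channels 2 and 3 (zero-based) as the two channels
    // delivered downstream.
    var channelMap: [Int32] = [2, 3]
    AudioUnitSetProperty(audioUnit,
                         kAudioOutputUnitProperty_ChannelMap,
                         kAudioUnitScope_Output,
                         1,  // input element of the I/O unit
                         &channelMap,
                         UInt32(MemoryLayout<Int32>.size * channelMap.count))
}
```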
Hello all, I am quite a newbie to this. I have been playing with programming over the years on and off, but I have decided now to have a real go at it and actually make it of some use. I have an idea in my head: I would like my app to allow me to take an image and then tag it on a map. What should I be focusing on to learn this? I am not looking for answers in here, as that really won't teach me anything; I am looking for resources that can grow my skill. I am starting with El Capitan (I am a long-time Mac user) and Xcode 7, with my dev iPhone on iOS 9. Cheers, Damien K
Hi. I'm having a hard time trying to make Game Center work in my tvOS game. I have added Game Center support to my previous games on iOS and OS X, but it seems my old classes are not working anymore on tvOS, and I can't find any reference to how Game Center is meant to be used on Apple TV. Do you know of any tvOS-oriented guide or tutorial?
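For reference, the modern authentication flow works the same way on tvOS as on iOS: set the authenticateHandler and present whatever view controller Game Kit hands back. A minimal sketch; `presentingController` is an assumption:

```swift
import GameKit
import UIKit

func authenticatePlayer(from presentingController: UIViewController) {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        if let viewController = viewController {
            // Game Center needs UI (e.g. sign-in); present it.
            presentingController.present(viewController, animated: true)
        } else if GKLocalPlayer.local.isAuthenticated {
            // Authenticated: safe to load leaderboards, achievements, etc.
        } else if let error = error {
            print("Game Center unavailable: \(error.localizedDescription)")
        }
    }
}
```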
I have written a program in SwiftUI to display information from a database that includes the street address, but not the latitude and longitude. I've added a button that passes the street address to geocodeAddressString to get the latitude and longitude and update the map. But every time I click the button, even though the map appears to be displayed properly, I get a message in Xcode complaining "Modifying state during view update...". My view has a @State variable that holds an MKCoordinateRegion() for the map to display. The closure on completion of the geocode lookup updates that @State.

@State private var mapRegion = MKCoordinateRegion()

private func updateAddress() {
    let index = data.zipCode.index(data.zipCode.startIndex, offsetBy: 5)
    geocoder.geocodeAddressString((data.streetAddress) (data.zipCode[..
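The usual fix for "Modifying state during view update" is to defer the @State mutation out of the current update pass, e.g. with DispatchQueue.main.async. A self-contained sketch; the names mirror the post, but the address string and span are assumptions:

```swift
import SwiftUI
import MapKit
import CoreLocation

struct AddressMapView: View {
    @State private var mapRegion = MKCoordinateRegion()
    private let geocoder = CLGeocoder()

    var body: some View {
        VStack {
            Map(coordinateRegion: $mapRegion)
            Button("Locate", action: updateAddress)
        }
    }

    private func updateAddress() {
        geocoder.geocodeAddressString("1 Infinite Loop, 95014") { placemarks, _ in
            guard let coordinate = placemarks?.first?.location?.coordinate else { return }
            // Hop to the main queue so the state change happens
            // outside the current view update.
            DispatchQueue.main.async {
                mapRegion = MKCoordinateRegion(
                    center: coordinate,
                    span: MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05))
            }
        }
    }
}
```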
Does localized data used for ARGeoTracking also include data submitted within Indoor Mapping (IMDF)?
I have a collection of images in a UICollectionView. For each image I get a dynamic number of hotspots (x, y coordinates) from a server, and I create and place dynamic UIButtons on those hotspots as subviews of the main UIImageView. I really need to know how to set up a focus guide for these dynamically created buttons at specific x, y coordinates on the UIImageView.
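A hedged sketch of one approach: place a UIFocusGuide in the gap between two dynamically created hotspot buttons so the focus engine can travel between arbitrary coordinates. Frames and button creation are assumptions:

```swift
import UIKit

// Bridge focus from `first` to `second` when they don't overlap on
// either axis (which normally blocks focus movement).
func linkHotspots(_ first: UIButton, _ second: UIButton, in container: UIView) {
    let guide = UIFocusGuide()
    container.addLayoutGuide(guide)

    // Position the guide in the gap between the two buttons.
    guide.leadingAnchor.constraint(equalTo: first.trailingAnchor).isActive = true
    guide.trailingAnchor.constraint(equalTo: second.leadingAnchor).isActive = true
    guide.topAnchor.constraint(equalTo: first.topAnchor).isActive = true
    guide.bottomAnchor.constraint(equalTo: first.bottomAnchor).isActive = true

    // When focus lands on the guide, redirect it to the next button.
    guide.preferredFocusEnvironments = [second]
}
```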
Hi, I'm confused about how the safe area layout guide works. I was told that I should use the safe area layout guide to align UI elements to the top and bottom of the safe area, instead of aligning them to the top and bottom of the root view, and that this would allow my apps to avoid the unsafe areas on the iPhone X, as well as the default navigation and status bars. When I did this and tested my apps on iOS 11, it all worked as advertised. However, when I tested my apps on iOS 10, it did not work at all: items that are supposed to be aligned to the top of the safe area instead extend underneath the navigation bar or status bar. Obviously, a lot of people still run iOS 10. I can't release an app that only works on iOS 11, at least not until the majority of users upgrade, which won't happen for at least a year. In the meantime, how do I create an app that works properly on both iOS 11 and earlier versions? Thanks, Frank
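A common backward-compatible pattern (one option among several) is to branch at runtime: use safeAreaLayoutGuide on iOS 11+, and fall back to the older top/bottom layout guides before that. A sketch with an assumed label:

```swift
import UIKit

func pinToSafeArea(_ label: UILabel, in viewController: UIViewController) {
    let view = viewController.view!
    label.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(label)

    if #available(iOS 11.0, *) {
        // iOS 11+: the safe area accounts for notch, bars, etc.
        label.topAnchor.constraint(
            equalTo: view.safeAreaLayoutGuide.topAnchor).isActive = true
    } else {
        // iOS 10 and earlier: use the old topLayoutGuide to stay
        // below the status/navigation bars.
        label.topAnchor.constraint(
            equalTo: viewController.topLayoutGuide.bottomAnchor).isActive = true
    }
    label.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
}
```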
What approach does Apple use in its iOS Maps app to show the annotation details? For example, to reproduce this in your own app, would it be via creation of a view controller in a storyboard and then use of the popover method? In that case, on iPhone I would normally see it take up the whole screen, so it seems either it's a different approach or you have to do some special customization above and beyond the base popover approach. https://s29.postimg.org/4hkp7qexj/forum_1.png
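One plausible way to get a popover-style callout on iPhone (an assumption about the technique, not a claim about what Maps actually does) is to present as .popover and opt out of the default adaptation to a full-screen sheet in compact widths:

```swift
import UIKit

class AnnotationDetailPresenter: NSObject, UIPopoverPresentationControllerDelegate {
    func present(_ detail: UIViewController,
                 from sourceView: UIView,
                 in host: UIViewController) {
        detail.modalPresentationStyle = .popover
        detail.popoverPresentationController?.sourceView = sourceView
        detail.popoverPresentationController?.sourceRect = sourceView.bounds
        detail.popoverPresentationController?.delegate = self
        host.present(detail, animated: true)
    }

    // Returning .none keeps the popover appearance even on iPhone,
    // instead of adapting to a full-screen presentation.
    func adaptivePresentationStyle(
        for controller: UIPresentationController) -> UIModalPresentationStyle {
        return .none
    }
}
```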
I know that this is slightly unorthodox, as it is less of a community question and more of a hope that an Apple engineer might see this. I would use a support ticket, however they are not allowed for pre-release software, therefore I am here. I currently have a Watch app on the App Store; the app is an activity tracker for cycling and running. I have a feature in the app that allows the user to see a map of their surrounding area. The map is created using an MKMapSnapshotter to create a UIImage, onto which I add some overlays showing the user's position and a polyline of where they have been. The issue is that now Apple Watch apps are native, the MapKit framework is off limits (with good reason). I therefore moved the snapshot code into the main app and simply requested the snapshot using the Watch Connectivity framework, which works... but only if the iOS app is running in the foreground. As stated in the documentation, the MKMapSnapshotter will only return the image if the app i
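For reference, the phone-side snapshot request described above looks roughly like this; the region, size, and completion handling are assumptions for illustration:

```swift
import MapKit
import UIKit

func requestSnapshot(of region: MKCoordinateRegion,
                     size: CGSize,
                     completion: @escaping (UIImage?) -> Void) {
    let options = MKMapSnapshotter.Options()
    options.region = region
    options.size = size

    let snapshotter = MKMapSnapshotter(options: options)
    snapshotter.start { snapshot, error in
        // The resulting image (after drawing position/polyline overlays)
        // could then be sent to the watch via WatchConnectivity.
        completion(snapshot?.image)
    }
}
```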