I have a code-only static library framework and added a PrivacyInfo.xcprivacy file inside it.
Because no resources are required at runtime,
an app using that framework can build without embedding it.
As a result, there is no PrivacyInfo.xcprivacy file in the app bundle.
Is this the intended behavior?
Or is some step needed to propagate and merge the static framework's privacy manifest into the app's privacy manifest?
Hello,
What are the guidelines for mergeable libraries regarding privacy data?
In particular, where do we put the PrivacyInfo.xcprivacy file in this situation, so that Apple's processing can scan it when we upload the app package?
Thank you.
I have a question about the privacy manifest process, that is:
Do I need to declare a privacy manifest file for SDKs that are not on Apple's list?
Let's take an example: I have two SDKs, SDK1 and SDK2, used in my app. Both SDKs use the "NSUserDefaults" privacy-impacting API, both are absent from Apple's list, and neither has its own privacy manifest file. Now, the questions are:
Do I need to include a privacy manifest file in both SDKs?
Or can I add one privacy manifest file at the app level, and Xcode will combine it or use that privacy manifest file for the SDKs too?
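(For reference, a manifest entry declaring NSUserDefaults use looks roughly like the sketch below. `CA92.1`, "access info from the same app", is one of Apple's approved reason codes; which code is correct depends on how the SDK actually uses NSUserDefaults.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>
```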
Thanks!
Using the DeviceActivity framework we are able to display data based on a user's Screen Time and device usage. With the DeviceActivityFilter property, you can specify the date interval over which to collect data.
In testing, it seems that data only becomes accessible once the extension has been installed (so the extension isn't reading the Screen Time data already collected on the device). However, once installed, I'm curious how far back the date interval can query data from.
Opal, which uses the Screen Time API, appears to have a lifetime Screen Time metric, so hypothetically it should be possible to query data as far back as collection starts, unless they are getting around the sandbox environment and storing the data somehow.
Side note on Opal: they seem to have a community average of Screen Time among people in the same age group. Does anyone know how they are collecting the data for this average? Is it actually using live Screen Time data, or just aggregating data from other studies?
There is frequently a delay of a few seconds before a DeviceActivityReport renders the view generated by the DeviceActivityReportExtension. It will also sometimes flash with zero data before hydrating with the real activity data (tested with extension code taken directly from the Xcode boilerplate).
Is there a way to be notified when the DeviceActivityReport has rendered successfully or is still processing, i.e. so a loading indicator can be presented while the extension runs?
Thanks!
In the new macOS and iOS updates (14.4 and 17.4 respectively), something has changed with regard to passkey creation:
Any passkey created from Safari doesn't have any transports + the authenticatorAttachment is always set to platform, irrespective of whether a cross-platform authentication method is utilized, such as a hardware security key.
All passkeys saved in iCloud Keychain created from any browser have an authenticatorAttachment always set to platform + empty authenticator transports.
authenticatorAttachment always set to platform
According to the WebAuthn specification (Section 5.4.5), the authenticatorAttachment descriptor plays a crucial role in guiding the client (browser or platform) to create or use an authenticator of a specific type. The options are platform for a built-in authenticator or cross-platform for a roaming authenticator.
Some relying parties mandate a cross-platform method for the first passkey or as second authentication factor. This is to ensure users do not find themselves locked out when they try to sign in from a device that doesn't have access to the non-roaming webauthn credential. Unfortunately, the current implementation in Sonoma 14.4 forces the authenticatorAttachment to platform, thus preventing the creation of passkeys that comply with such policies on websites.
For comparison, browsers like Chrome correctly return a cross-platform authenticatorAttachment when a hardware security key is used, and the same used to happen on previous macOS and iOS versions from Safari.
Authenticator transports missing
The absence of transport data (WebAuthn Section 5.8.4) for all passkeys created via Safari and iCloud Keychain passkeys created from all browsers further complicates the scenario. The transport hint is crucial for informing relying parties about the preferred transport method for the authenticator, be it USB, NFC, BLE, HYBRID or internal. This omission could lead to inefficiencies and a diminished user experience, as the system cannot optimize the authentication process based on the authenticators available to the user.
These issues jeopardize the utility and adoption of passkeys across various platforms and browsers, a primary goal of WebAuthn and FIDO2 for widespread secure authentication practices. What is the rationale behind this choice and is there any workaround to be considered?
Thanks for all the help and clarification!
I'm noticing a trend in 'foreign' home security products: they want a combination of QR code scanning and home router connections for 'easy setup'.
The iOS apps that have to be used with these products require the user to enter their home WiFi password directly into the app. Such apps also commonly request location data.
If unencrypted router passwords and the location data of the router are being captured and sent back to the manufacturer, this would be very, very bad.
Of the few things I've put on the App Store, Apple went through my code with a fine-tooth comb looking for things that went against their protocols, and I had to do multiple revisions to bring them in line. Although frustrating at the time, I was pleased to know this kind of screening happened.
I've heard Apple won't allow apps to do key logging/capture. Fantastic.
Is the handling of our home network credentials also heavily scrutinised before things are allowed on the App Store?
This page (https://support.apple.com/en-in/guide/deployment/dep0a2cb7686/web) indicates that some usage of the fdesetup command-line tool is deprecated, such as turning on FileVault using a username/password.
However, I don't see any proper information about which options of the fdesetup tool are deprecated and which are still valid.
Any pointers for that?
Thanks,
N
Let's say I have an iOS app on the App Store. Anyone can download and use it, but I would like to restrict access to certain features to a select set of people I can personally vouch for. So, for example, to get access, you send an email to me, you have to convince me that I know you, and if you do, I send you back some kind of token string which you can enter into the app.
However, I'd like for that token to not be shareable, and to be locked to that device.
Is there any kind of persistent ID associated with a device that I can use to tie the token I grant to that persistent ID?
Or can someone suggest a way that, once I trust a user, I can give them a token which cannot be shared with anyone else?
Also, does anyone know if restricting access to app features in this way poses any kind of issue with regard to the app review process? The app itself is free, and there are no in-app purchases. I simply don't want certain features of the app (which end up sending push notifications) to get abused.
The description says this event will be raised for "An identifier for a process that notifies endpoint security that it is updating a file." What does this mean?
Similarly when will ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_MATERIALIZE event be raised ?
Do these events get raised if a cloud sync app like Google Drive/Dropbox/OneDrive that uses the File Provider framework syncs data?
In my Endpoint Security app, I have registered for these events but I didn't receive any.
*I do receive other Endpoint Security events like ES_EVENT_TYPE_NOTIFY_CLONE, etc.
Hi, I'm trying to achieve the following OpenSSL workflow in Swift.
I have this intermediate certificate from Let's Encrypt, and I want to extract the public key from it, then hash it with SHA-256, and finally encode it in base64.
The OpenSSL commands that achieve this look like this:
openssl x509 -in isrgrootx1.pem -pubkey -noout > publickey.pem
openssl rsa -pubin -in publickey.pem -outform der | openssl dgst -sha256 -binary | openssl enc -base64
I've tried the Security, CommonCrypto, and CryptoKit frameworks with no success. I was able to get the public key out of the certificate, but its PEM representation seems to differ slightly from what I get with OpenSSL. At the beginning of the public key, the OpenSSL version has a string that is not present in what I get with Swift, but the rest is the same.
This is the Swift code I use:
import Foundation
import Security
import CommonCrypto

// Step 1: Extract public key from the certificate
func extractPublicKey(from certificate: SecCertificate) -> SecKey? {
    var publicKey: SecKey?
    if let publicKeyRef = SecCertificateCopyKey(certificate) {
        publicKey = publicKeyRef
    }
    return publicKey
}

// Step 2: Calculate SHA-256 hash of the public key
func calculateSHA256(of data: Data) -> Data {
    var hash = [UInt8](repeating: 0, count: Int(CC_SHA256_DIGEST_LENGTH))
    data.withUnsafeBytes {
        _ = CC_SHA256($0.baseAddress, CC_LONG(data.count), &hash)
    }
    return Data(hash)
}

// Step 3: Encode data as base64
func base64EncodedString(from data: Data) -> String {
    return data.base64EncodedString()
}

// Step 4: Main function to perform all steps
func processCertificate(certificate: SecCertificate) {
    // Step 1: Extract public key
    guard let publicKey = extractPublicKey(from: certificate) else {
        return
    }
    // Step 2: Export public key as data
    guard let publicKeyData = SecKeyCopyExternalRepresentation(publicKey, nil) as Data? else {
        print("Failed to export public key data")
        return
    }
    // Step 3: Calculate SHA-256 hash of the public key
    let sha256Hash = calculateSHA256(of: publicKeyData)
    // Step 4: Encode SHA-256 hash as base64
    let base64EncodedHash = base64EncodedString(from: sha256Hash)
    print("SHA-256 hash of public key (base64 encoded): \(base64EncodedHash)")
}
This is the Public Key I get with OpenSSL:
-----BEGIN PUBLIC KEY-----
MIICIjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEAregkc/QUN/ObnitXKByHvty33ziQjG485legePd1wqL+9Wpu9gBPKNveaIZsRJO2sWP9FBJrvx/S6jGbIX7RMzy6SPXded+zuP8S8SGaS8GKhnFpSmZmbI9+PHC/rSkiBvPkwOaAruJLj7eZfpQDn9NHl3yZSCNT6DiuTwpvgy7RSVeMgHS22i/QOI17A3AhG3XyMDz6j67d2mOr6xZPwo4RS37PC+j/tXcu9LJ7SuBMEiUMcI0DKaDhUyTsE9nuGb8Qs0qMP4mjYVHerIcHlPRjcewu4m9bmIHhiVw0eWx27zuQYnnm26SaLybF0BDhDt7ZEI4W+7f3qPfH5QIHmI82CJXn4jeWDTZ1nvsOcrEdm7wD+UkF2IHdBbQq1kHprAF2lQoP2N/VvRIfNS8oF2zSmMGoCWR3bkc3us6sWV5onX9y1onFBkEpPlk+3Sb1JMkRp1qjTEAfRqGZtac6UW6GO559cqcSBXhZ7T5ReBULA4+N0C8Fsj57ShxLcwUS/Mbq4FATfEOTdLPKdOeOHwEI0DDUW3E2tAe6wTAwXEi3gjuYpn1giqKjKYLMur2DBBuigwNBodYF8RvCtvCofIY7RqhIKojcdpp2vx9qpT0Zj+s482TeyCsNCij/99viFULUItAnXeF5/hjncIitTubZizrG3SdRbv+8ZPUzQ08CAwEAAQ==
-----END PUBLIC KEY-----
and this is what I get with Swift:
-----BEGIN PUBLIC KEY-----
MIICCgKCAgEAregkc/QUN/ObnitXKByHvty33ziQjG485legePd1wqL+9Wpu9gBPKNveaIZsRJO2sWP9FBJrvx/S6jGbIX7RMzy6SPXded+zuP8S8SGaS8GKhnFpSmZmbI9+PHC/rSkiBvPkwOaAruJLj7eZfpQDn9NHl3yZSCNT6DiuTwpvgy7RSVeMgHS22i/QOI17A3AhG3XyMDz6j67d2mOr6xZPwo4RS37PC+j/tXcu9LJ7SuBMEiUMcI0DKaDhUyTsE9nuGb8Qs0qMP4mjYVHerIcHlPRjcewu4m9bmIHhiVw0eWx27zuQYnnm26SaLybF0BDhDt7ZEI4W+7f3qPfH5QIHmI82CJXn4jeWDTZ1nvsOcrEdm7wD+UkF2IHdBbQq1kHprAF2lQoP2N/VvRIfNS8oF2zSmMGoCWR3bkc3us6sWV5onX9y1onFBkEpPlk+3Sb1JMkRp1qjTEAfRqGZtac6UW6GO559cqcSBXhZ7T5ReBULA4+N0C8Fsj57ShxLcwUS/Mbq4FATfEOTdLPKdOeOHwEI0DDUW3E2tAe6wTAwXEi3gjuYpn1giqKjKYLMur2DBBuigwNBodYF8RvCtvCofIY7RqhIKojcdpp2vx9qpT0Zj+s482TeyCsNCij/99viFULUItAnXeF5/hjncIitTubZizrG3SdRbv+8ZPUzQ08CAwEAAQ==
-----END PUBLIC KEY-----
Interestingly, if I take the public key produced by Swift and run the second command on it, I still get the correct final result. Unfortunately, in Swift alone I don't get the correct final result.
I suspect it must be something about headers, since I was able to get the correct output with OpenSSL using the public key obtained from Swift.
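A hedged reading of the difference: for an RSA key, SecKeyCopyExternalRepresentation returns the bare PKCS#1 bytes, while OpenSSL emits a full SubjectPublicKeyInfo (SPKI) structure, which prefixes those same bytes with a fixed ASN.1 header. Assuming an RSA-4096 key (the size suggested by the PEM above), a sketch of prepending that header before hashing:

```swift
import Foundation

// Fixed ASN.1 SubjectPublicKeyInfo header for an RSA-4096 public key.
// (RSA-2048 keys use a shorter variant of this prefix.)
let rsa4096SPKIHeader: [UInt8] = [
    0x30, 0x82, 0x02, 0x22, 0x30, 0x0d, 0x06, 0x09,
    0x2a, 0x86, 0x48, 0x86, 0xf7, 0x0d, 0x01, 0x01,
    0x01, 0x05, 0x00, 0x03, 0x82, 0x02, 0x0f, 0x00
]

// Wrap the bare PKCS#1 key bytes (what SecKeyCopyExternalRepresentation
// returns for an RSA key) into an SPKI structure before hashing.
func addingSPKIHeader(to pkcs1Key: Data) -> Data {
    return Data(rsa4096SPKIHeader) + pkcs1Key
}

// Base64 of the header alone reproduces the extra prefix seen in the
// OpenSSL PEM but missing from the Swift export.
print(Data(rsa4096SPKIHeader).base64EncodedString())
// → MIICIjANBgkqhkiG9w0BAQEFAAOCAg8A
```

Hashing `addingSPKIHeader(to: publicKeyData)` instead of `publicKeyData` should then match the OpenSSL pipeline.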
Any ideas?
Hi, I am creating a simple app with iOS 17. I want to authenticate via the iOS passcode, but I couldn't find any example of it. Where can I get some example of using the iOS passcode in iOS 17? Please help me.
I mounted a third-party file system on macOS, and I want to monitor copy events performed by the Finder on this file system, so I use an Endpoint Security client.
I know that ES_EVENT_TYPE_NOTIFY_CLONE will only be triggered by Apple File System clone operation. ES_EVENT_TYPE_NOTIFY_COPYFILE is triggered by the SYS_copyfile system call.
If I want to monitor the copy/paste operation performed by the Finder (the copy can happen within the third-party file system or between it and the Apple File System), which ES event should I register for?
When getting the password from the keychain, I encountered an exception and the password was lost. The reason for the exception is "Error Domain=com.samsoffes.sskeychain Code=-25300". I want to know why this happens and how likely it is to happen.
I have a request from a client who would like to create a solution where a central component is the ability to retrieve the user's app usage. I can see that there are various APIs to do that; one example is the Device Activity API. The client also hopes that this can be a web solution and not an app. I haven't quite been able to figure out whether this is possible or not.
I have three questions:
Is it possible to retrieve app usage information through a REST API (or similar) outside an iOS app?
Is it possible to use the Apple ID as a kind of auth solution for a web application?
Are there any similar APIs for versions older than iOS 15?
What happens if I submit an app and one of the third party libraries (but not in the big list of common third party libraries) in my app has errors in its privacy manifest? Does my app get rejected? Or does Apple go after the third party to fix their library?
The error is simple enough. They simply failed to include the NSPrivacyCollectedDataTypes key. Actually, it is missing other keys but the error report probably stopped at the first one.
The error is from Xcode > Window > Organizer > Archives > command click an archive from the list > Generate Privacy Report.
The exact text of the error in that report is:
Errors Encountered
Missing an expected key: 'NSPrivacyCollectedDataTypes'
I am not concerned with how to fix the syntax. I know that much. I want to know what Apple will do if I submit the app for review with the errors present in the third-party lib. There are verbal rumors and speculation that Apple contacts third-party library devs and leaves app devs alone if a library messes up its privacy file, but I cannot find any confirmation of this on the Internet.
And again, while this lib is from a medium-size commercial vendor, it is not common enough to be on Apple's list.
So, as we know, it's nearly 1/5 and the Privacy Manifest deadline is near. I had taken care of almost every case in my project, but the thing is I keep getting warning emails saying that I need to declare some "required reason API" that I'm using in my code.
Which I'm currently not.
So, after thinking a bit, I decided to look into the IPA and inspect the binary in the IPA's package contents using the nm command.
And surprisingly, I could easily see all the "required APIs" stated in the email Apple sent me for my release.
So my question is: do we really need to handle those cases too? Because it's always the "behind the scenes" code using those APIs, and honestly I can't confirm where they run or what they do (due to the limited time until 1/5).
I have a .p12 file which contains two certificates, but no identities. When attempting to use SecPKCS12Import on it, the call returns a success code, but the CFArray is empty:
func testParsingCert() throws {
    let bundle = Bundle(for: Self.self)
    let certificateURL = bundle.url(forResource: TestConstants.SERVER_CERTIFICATE_NAME, withExtension: TestConstants.CERTIFICATE_FILE_EXTENSION)!
    let certificateData = try! Data(contentsOf: certificateURL)

    var importResult: CFArray? = nil
    let err = SecPKCS12Import(
        certificateData as NSData,
        [kSecImportExportPassphrase as String: TestConstants.DEFAULT_CERT_PASSWORD] as NSDictionary,
        &importResult
    )
    guard err == errSecSuccess else {
        throw NSError(domain: NSOSStatusErrorDomain, code: Int(err), userInfo: nil)
    }

    let identityDictionaries = importResult as! [[String: Any]]
    var chain: CFArray
    chain = identityDictionaries[0][kSecImportItemCertChain as String] as! CFArray
    print(chain)
}
The above code fails with
Test Case '-[TAKTrackerTests.CertificateSigningRequestTests testParsingCert]' started.
Swift/ContiguousArrayBuffer.swift:600: Fatal error: Index out of range
as the identityDictionaries result contains no entries (nor does importResult).
The specific use case for this is that users can perform certificate enrollment against a server with a self-signed certificate, so they need to be able to upload the trust store prior to connecting for identities.
Hi,
I have met with a rather interesting phenomenon today and I couldn't figure out the reason.
As part of a script, I import certificates and for that I create a designated keychain:
security create-keychain -p "" $KEYCHAIN_NAME.keychain-db
This has so far been creating the keychain at the expected location, /Users/my-user/Library/Keychains/$KEYCHAIN_NAME.keychain-db.
However, I have noticed that since yesterday, my script has been failing with a
security: SecKeychainCreate XXXXXXXXX.keychain-db: UNIX[Permission denied]
error.
I kept investigating and noticed that the same script as given above, now tries to create the keychain on the /Library/Keychains/$KEYCHAIN_NAME.keychain-db path (the same path where System.keychain is located).
I confirmed this in two ways:
running the command with sudo no longer resulted in the above UNIX error; instead, it created the keychain next to the System keychain.
locally, I tried to create a keychain with an absolute path, like this: security create-keychain -p 1234 "/Library/Keychains/new.keychain" and got back the same UNIX[Permission denied] error.
I tried to poke around in the man page for security and searched online, but found nothing that mentions the default path changing for the security command (it must be some setting for security, given that a simple XXXX.keychain used to be created at ~/Library/Keychains/***.keychain, whichever folder I execute the command from).
Thanks in advance for any advice!
Hi,
I'm wondering if we'd want to improve the clarity of the Apple Platform Security guide (dated 2022) on the iOS app security model (page 99), as edits might have lost the intended structure of the sentence (although I might be reading it wrong).
Current text:
At runtime, code signature checks that all executable memory pages are made as they are loaded to help ensure that an app hasn’t been modified since it was installed or last updated.
Possible rephrasing:
At runtime, iOS checks code signature on all executable memory pages as they are loaded to help ensure that an app hasn’t been modified since it was installed or last updated.