I'm working on an Angular application that retrieves static data (JSON, MP3, and images) from a backend server, with a Cache-Control response header of Cache-Control: public, max-age=2592000. I expect these files to be served from either the disk or memory cache after the initial request. However, in Safari, the browser sometimes serves the data from the cache and other times makes a network call. This inconsistent behavior is particularly noticeable with MP3 files, whereas JSON and image files are consistently served from the cache as expected.
I've tested this on multiple Safari versions and observed the same issue:
Version 17.2 (19617.1.17.11.9)
Version 17.1 (19616.2.9.11.7)
Version 17.3 (19617.2.4.11.8)
I confirmed that the "Disable Cache" option is not enabled in the developer tools, so the MP3 files should be cached. This functionality works correctly in Chrome and Firefox without any issues.
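One avenue worth checking (this is a hypothesis, not a confirmed cause): Safari fetches audio via byte-range requests and can treat responses differently from Chrome when the server omits headers such as Accept-Ranges or a validator. A quick way to inspect what the backend actually sends is curl; the snippet below uses a canned response so the parsing step is reproducible, and the real URL is a placeholder you would substitute:

```shell
# Real check would be:
#   curl -sI "https://<your-backend>/path/to/file.mp3"
# Canned headers standing in for the server's response:
response='HTTP/1.1 200 OK
Cache-Control: public, max-age=2592000
Accept-Ranges: bytes'

# Pull out the advertised cache lifetime from Cache-Control.
max_age=$(printf '%s\n' "$response" | sed -n 's/.*max-age=\([0-9][0-9]*\).*/\1/p')
echo "$max_age"
```

If the MP3 responses lack Accept-Ranges or an ETag/Last-Modified, that would be worth comparing against the JSON and image responses.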
I built a custom binary which requires elevated privileges to execute. I wrote a launchd plist file and loaded it. On loading, the app works as expected, but a lot of other apps were corrupted. Apps like Chrome, Slack, and Zoom became unopenable. Even my mouse's right click stopped working. When I looked at the contents of Chrome in the Applications directory, it was missing a few files.
Contents of Chrome before launching my custom app:
-rw-r--r--@ 1 myusername admin 2556 May 14 16:49 CodeResources
drwxr-xr-x@ 3 myusername admin 96 May 14 15:59 Frameworks
-rw-r--r--@ 1 myusername admin 11851 May 14 16:17 Info.plist
drwxr-xr-x@ 3 myusername admin 96 May 14 15:59 Library
drwxr-xr-x@ 3 myusername admin 96 May 14 16:17 MacOS
-rw-r--r--@ 1 myusername admin 8 May 14 15:59 PkgInfo
drwxr-xr-x@ 61 myusername admin 1952 May 14 15:59 Resources
drwxr-xr-x@ 3 myusername admin 96 May 14 16:17 _CodeSignature
-rw-r--r--@ 1 myusername admin 12551 May 14 15:59 embedded.provisionprofile
Contents of Chrome after launching my custom app:
drwxr-xr-x@ 3 myusername admin 96 May 14 15:59 Frameworks
drwxr-xr-x@ 3 myusername admin 96 May 14 15:59 Library
drwxr-xr-x@ 2 myusername admin 64 May 16 13:48 MacOS
drwxr-xr-x@ 58 myusername admin 1856 May 16 13:48 Resources
drwxr-xr-x@ 2 myusername admin 64 May 16 13:48 _CodeSignature
My custom app's plist file:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>com.zzzz.xxxx</string>
<key>Program</key>
<string>/path/to/app</string>
<key>RunAtLoad</key>
<true/>
</dict>
</plist>
If I run my custom app as a standalone process, everything works as expected: no corruption or data loss.
To restore the apps, I reinstalled them, but I can't figure out why this is happening. I'm also not sure how to make secondary click work again; I tried multiple mice, same issue.
Any help on this is appreciated.
I've read the definitive "Recording Private Data in the System Log" by @eskimo, plus the words at man 5 os_log, and written code specifically to turn on "Enable-Private-Data" in my app.
My application is a command-line tool, and I've configured Xcode to embed what I believe to be the appropriate incantations, via an Info.plist file, into the otherwise unstructured executable binary. When I run the app from Terminal, I see <private> output in the Console app where I expect values to be displayed publicly.
Nothing I've read says that <key>Enable-Private-Data</key><true/> doesn't apply to command-line apps, and my own understanding of the logging mechanism rejects that notion, because logging is performed all over macOS, not just in a ***.app environment.
At this point, I'm firmly convinced this unexpected behavior is of my own doing, but I have paused the search for my (probably embarrassing) mistake to write this note, because of a 1% doubt that I'm wrong.
I'd be very happy to receive the expected assurance that logging configuration via an embedded Info.plist in a command-line app does influence logging behavior. With that assurance, I'll know it's my problem and I'll search, find, and fix. Along the way, I'll create the simplest command-line app that exhibits this anomaly -- which will likely reveal my error and, if not, will be fodder for a bug report.
Embedding an Info.plist into a command-line app is a tad out of the ordinary, but I've done it before (using Xcode or SPM) to carry knowledge into a CLI via mainBundle.infoDictionary. In the particular case described above, I've printed that infoDictionary to show the successful embedding, viz:
. . . .
"OSLogPreferences": {
"com.ramsaycons" = {
"DEFAULT-OPTIONS" = {
"Enable-Private-Data" = 1;
};
};
},
. . . .
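For reference, the printed dictionary above should correspond to an Info.plist fragment along these lines (the subsystem name is taken from the output above; the nesting is my reading of it):

```xml
<key>OSLogPreferences</key>
<dict>
    <key>com.ramsaycons</key>
    <dict>
        <key>DEFAULT-OPTIONS</key>
        <dict>
            <key>Enable-Private-Data</key>
            <true/>
        </dict>
    </dict>
</dict>
```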
Sonoma 14.5 / Xcode 15.4 / MBP (Apple M1 Max)
I've implemented Apple Archive in the app I'm working on, using the LZFSE algorithm. The framework is extremely impressive in terms of speed and compression ratios. I'm using the closure in ArchiveStream.writeDirectoryContents to indicate which file is currently being processed, but there doesn't seem to be a way to pull out a progress percentage for each file. I've seen examples in [Accelerate](https://developer.apple.com/documentation/accelerate/compressing_and_decompressing_files_with_stream_compression), but those don't use the Apple Archive framework. I'd really like to stick with Apple Archive if possible, as it's simple and very fast.
When I go jogging early in the morning, I've noticed that my watch will sometimes ask me if I'm awake after a few miles or so. This happens while I am actively using the Fitness app to record my run. I would think the watch would already know that I am awake as soon as the Fitness app is activated.
We want to overlay a SwiftUI attachment on a RealityView, like it is done in the Diorama sample. By default, the attachments seem to be placed centered at their position. However, for our use-case we need to set a different anchor point, so the attachment is always aligned to one of the corners of the attachment view, e.g. the lower left should be aligned with the attachment's position. Is this possible?
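For what it's worth, here is a minimal sketch of the kind of adjustment we have in mind, assuming the attachment's extents are known up front (the attachment id, target position, and half-size values are all placeholders):

```swift
import SwiftUI
import RealityKit

struct CornerAnchoredAttachment: View {
    var body: some View {
        RealityView { content, attachments in
            if let panel = attachments.entity(for: "label") {
                // Shift by half the view's assumed extents so the lower-left
                // corner, not the center, lands on the target position.
                let target = SIMD3<Float>(0.0, 1.0, -1.0)      // placeholder
                let halfSize = SIMD3<Float>(0.15, 0.10, 0.0)   // placeholder
                panel.position = target + halfSize
                content.add(panel)
            }
        } update: { _, _ in
        } attachments: {
            Attachment(id: "label") {
                Text("Hello")
            }
        }
    }
}
```

Hard-coding half-extents like this is exactly what we want to avoid, which is why a built-in anchor-point option would help.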
I have the Vision Pro developer strap, do I need to do anything to make Instruments transfer the data over it rather than wifi? Or will it do that automatically?
It seems incredibly slow for transferring and then analysing data.
I can see the Vision Pro recognised in Configurator, so assume it's working.
Otherwise, any tips for speeding up Instruments? Capturing 5 minutes of gameplay (high-frequency) takes 30-40+ minutes to appear in Instruments on an M2 Max with 32 GB.
Thanks!
Hello, I'm currently integrating a feature in our app that allows customers to set up a passkey. Once set up, users are prompted to use their passkey at the sign-in page. For users without a registered passkey, we ensure that the passkey assertion request fails silently to maintain a smooth login experience, using preferImmediatelyAvailableCredentials effectively for this purpose.
However, we've noticed that when users are employing third-party password managers like 1Password or Bitwarden, they encounter a QR code fallback. Discussions with 1Password have revealed that iOS does not currently extend preferImmediatelyAvailableCredentials to these services.
I would appreciate any advice on how to harmonize the behavior between iCloud Keychain and third-party password providers to ensure a consistent user experience.
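For context, the silent-assertion path we use looks roughly like this (a sketch: the relying-party identifier is a placeholder, the challenge comes from our server, and delegate handling is elided):

```swift
import AuthenticationServices

// Sketch: attempt a passkey assertion without showing UI when no
// local credential exists (iOS 16+).
func signInSilently(challenge: Data, delegate: ASAuthorizationControllerDelegate) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")   // placeholder RP ID
    let request = provider.createCredentialAssertionRequest(challenge: challenge)
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    // Fails fast instead of presenting UI when no passkey is immediately
    // available -- but, as described above, third-party credential
    // providers apparently do not receive this hint today.
    controller.performRequests(options: .preferImmediatelyAvailableCredentials)
}
```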
I am developing an iOS app that utilizes the timeInDaylight data from HealthKit. This feature is available starting from iOS 17.0+, but it can only be recorded using the Apple Watch SE (2nd generation) and Apple Watch Series 6 or later.
How should I release this app given its dependency on timeInDaylight data? Without this data, the app is useless.
I understand that the App Store does not allow setting specific health data requirements. I am also concerned that including a "device requirements" warning in the App Store description might not pass the review process.
Could you provide any advice on how to approach this situation?
Thank you.
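One approach I'm considering (an assumption on my part, not App Review guidance) is to detect at runtime whether any timeInDaylight samples exist and degrade gracefully with an in-app explanation, rather than gating the listing itself. A sketch, assuming read authorization has already been requested:

```swift
import HealthKit

// Sketch: check whether any timeInDaylight samples are present, so the
// app can show an explanatory screen instead of silently being useless.
@available(iOS 17.0, *)
func daylightDataAvailable(store: HKHealthStore,
                           completion: @escaping (Bool) -> Void) {
    let type = HKQuantityType(.timeInDaylight)
    let query = HKSampleQuery(sampleType: type,
                              predicate: nil,
                              limit: 1,
                              sortDescriptors: nil) { _, samples, _ in
        completion(samples?.isEmpty == false)
    }
    store.execute(query)
}
```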
I have some usdz files saved and I would like to make 2D thumbnails for them. I was checking "Creating Quick Look Thumbnails to Preview Files in Your App", but it says "Augmented reality objects using the USDZ file format (iOS and iPadOS only)". I would like the same functionality in my visionOS app. How can I do that?
I thought about using some API to convert the 3D asset into a 2D asset, but it would be better if I could do that inside the Swift environment.
Basically, I want to do Image(uiImage: "my_usdz_file").
New to Apple development. Vision Pro is the reason I got a developer license and am learning Xcode, SwiftUI, and so on.
The Vision Pro tutorials seem to use Wi-Fi or the developer strap to connect the development environment to the Vision Pro. I have the developer strap, but can't use it on my company computer. I have been learning using the developer tools, but I can't test the apps on my personal Vision Pro.
Is there a way to generate an app file on the MacBook that I can download to the Vision Pro? This would be a file that I could transfer to cloud storage and then download to the Vision Pro using Safari.
I will eventually get a Vision Pro at work, but until then I want to start developing.
Hi there! I am trying to build a macOS app using Electron. One feature of the app depends on an HTTP server running locally. This server was built in Java.
Both the compiled server and the Java Runtime Environment are bundled in the build. To start the server, I use Node.js's child_process.spawn, pointing it at the bundled JRE's executable and the server implementation.
The issue I am facing is that the Java Virtual Machine is not starting. It returns the following error message:
Error: Port Library failed to initialize: -1
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Both the JRE and the server are located in the Contents directory, in a subdirectory I created for them.
Here are the app's entitlements:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.application-groups</key>
<string>REDACTED</string>
<key>com.apple.application-identifier</key>
<string>REDACTED</string>
<key>com.apple.developer.team-identifier</key>
<string>REDACTED</string>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
<key>com.apple.security.cs.disable-executable-page-protection</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.network.server</key>
<true/>
<key>com.apple.security.device.microphone</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.device.camera</key>
<true/>
<key>com.apple.security.print</key>
<true/>
<key>com.apple.security.files.user-selected.read-write</key>
<true/>
<key>com.apple.security.temporary-exception.files.absolute-path.read-write</key>
<true/>
</dict>
</plist>
Here the entitlements inherit:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.inherit</key>
<true/>
</dict>
</plist>
Is there any missing step to allow the spawning of this process?
I am going through the list of ways to check whether my app has been given Full Disk Access (FDA). Of these, only one method is supported by Apple:
@note The only supported way to check if an application is properly TCC authorized for Full Disk Access
* is to call es_new_client and handling ES_NEW_CLIENT_RESULT_ERR_NOT_PERMITTED in a way appropriate
* to your application.
I have implemented this method using EndpointSecurity, calling it from a root process as required. But when I disable System Integrity Protection (SIP) and call it, it succeeds without FDA; no error is thrown. I then tested our app, and both the EndpointSecurity functionality and access to protected folders (like the Documents folder) work fine without FDA when SIP is disabled. Now my questions are:
When SIP is disabled, does every app have FDA by default?
Is there any use case that still needs FDA when SIP is off?
Is there any way to check whether FDA has been granted while SIP is off, since the method above won't work in that case?
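For reference, my implementation of the documented probe is along these lines (a sketch only: it assumes the endpoint-security entitlement and a root caller, and collapses all non-TCC failures to false):

```c
#include <EndpointSecurity/EndpointSecurity.h>
#include <stdbool.h>

// Documented FDA probe: try to create an ES client and interpret
// ES_NEW_CLIENT_RESULT_ERR_NOT_PERMITTED as "Full Disk Access missing".
bool has_full_disk_access(void) {
    es_client_t *client = NULL;
    es_new_client_result_t res = es_new_client(&client,
        ^(es_client_t *c, const es_message_t *msg) { /* no-op handler */ });

    switch (res) {
    case ES_NEW_CLIENT_RESULT_SUCCESS:
        es_delete_client(client);   // we only wanted the creation result
        return true;
    case ES_NEW_CLIENT_RESULT_ERR_NOT_PERMITTED:
        return false;               // TCC: FDA not granted
    default:
        return false;               // entitlement/resource/other failure
    }
}
```

As described above, with SIP disabled this returns true even without FDA, which is exactly the ambiguity I'm asking about.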
iMessage sometimes does not show Firebase Dynamic Link preview, while other apps like Messenger or Slack always work. I don't know if it is because of Firebase or the image itself, as there are some images which do show up in preview and some images don't.
If anyone has any clue, please let me know. Thanks!
I tried to enroll in the Apple Developer Program, but after the consent page it always shows "Your enrollment in the Apple Developer Program could not be completed at this time." Please help me fix this problem. Thank you!
I made a simple app in Swift.
It was compiled on a MacBook Pro 16 (Apple M3 Pro) running Sonoma 14.4.1, with Xcode 15.3.
When I run this app on a Big Sur (11.5.2) / Intel Core i5 iMac, the following message appears:
"The 'MYAPP' application cannot be opened because this application is not supported on this Mac."
If I run MyApp.app/Contents/MacOS/myapp directly, "...Bad CPU type in executable" is displayed.
Build Settings are as follows:
Minimum Deployments: macOS 11.0
Architectures: Standard Architectures (Apple Silicon, Intel) - $(ARCHS_STANDARD)
Build Active Architecture Only : Debug - Yes / Release - No
Code signing: Developer ID
Notarization: completed
Additionally, SimpleFirewall (the Network Extension example) works well on Big Sur.
I compared all settings with the SimpleFirewall project and couldn't find anything unusual.
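Since "Bad CPU type in executable" on an Intel Mac usually means the binary is missing an x86_64 slice, one way to check the shipped product (run on the build machine; the path matches the app above) would be:

```shell
# List the architecture slices in the built executable.
lipo -archs MyApp.app/Contents/MacOS/myapp
# A universal app should list both x86_64 and arm64; if only arm64
# appears, the Release build did not include the Intel slice.
```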
Hi everyone,
I would like to ask about a system setting: is there anything that can make the Mac wake up faster when the display is off after logout?
It takes around 20 seconds to show the login screen after a key press.
I want it to be faster. Is there any setting for this behavior?
Thanks.
In this picture, users can find '楽天ラクマ' by entering 'メルカリ'. How do I set keywords like 'メルカリ' for an app like '楽天ラクマ'?
I thought apps in the iPhone App Library were searched only by their name (CFBundleDisplayName), but that turns out not to be the case.
I installed @react-native-firebase/firestore in my React Native project and changed my Podfile as I did previously when installing Firebase for push notifications. Now I'm trying to store my FCM token in Firestore, and while building the app I got this error: Framework 'FirebaseFirestoreInternal' not found. But I have installed it; it's in my Podfile:
pod 'Firebase', :modular_headers => true
pod 'FirebaseCoreInternal', :modular_headers => true
pod 'GoogleUtilities', :modular_headers => true
pod 'FirebaseCore', :modular_headers => true
pod 'FirebaseFirestore', :modular_headers => true
pod 'FirebaseCoreExtension', :modular_headers => true
pod 'FirebaseFirestoreInternal', :modular_headers => true
It's even in my Podfile.lock: FirebaseFirestoreInternal (10.24.0). The first four lines caused no issue; I successfully installed and tested push notifications, but FirebaseFirestoreInternal has some problem and I don't know why. Can anyone help?
I need a programmatic way to invoke the control shown in the picture.
The AppleScript I use is as follows: osascript -e 'set volume output volume 50'