Hi,
When using Safari on macOS over IPv6, we found that the XSRF-TOKEN cookie is not set. Our authentication response includes the header 'Set-Cookie: XSRF-TOKEN=*******; SameSite=Strict; Secure'.
It works when using Safari over IPv4, and also with Chrome/Firefox over both IPv4 and IPv6. It also worked with Safari 15.6.1 over IPv6.
May I know if this is an issue or by design? Is anyone aware of this?
Thanks.
Posts under macOS tag
177 Posts
Xcode 16 does not display simulators. Problem running on physical device with iOS 18.
error HE0004: Could not load the framework 'IDEDistribution' (path: /Applications/Xcode.app/Contents/SharedFrameworks/IDEDistribution.framework/Versions/A/IDEDistribution):
dlopen(/Applications/Xcode.app/Contents/SharedFrameworks/IDEDistribution.framework/Versions/A/IDEDistribution, 0x0001): Library not loaded: @rpath/AppThinning.framework/Versions/A/AppThinning
Referenced from: <33FF2F3B-A96F-37B4-BA4E-887BD882BF9D> /Applications/Xcode.app/Contents/SharedFrameworks/IDEDistribution.framework/Versions/A/IDEDistribution
Reason: tried: '/Library/Frameworks/Xamarin.iOS.framework/Versions/16.4.0.23/lib/mlaunch/mlaunch.app/Contents/Frameworks/AppThinning.framework/Versions/A/AppThinning' (no such file), '/Applications/Xcode.app/Contents/SharedFrameworks/IDEDistribution.framework/Versions/A/Frameworks/AppThinning.framework/Versions/A/AppThinning' (no such file), '/Library/Frameworks/Xamarin.iOS.framework/Versions/16.4.0.23/lib/mlaunch/mlaunch.app/Contents/Frameworks/AppThinning.framework/Versions/A/AppThinning' (no such file), '/Applications/Xcode.app/Contents/SharedFrameworks/IDEDistribution.framework/Versions/A/Frameworks/AppThinning.framework/Versions/A/AppThinning' (no such file), '/Library/Frameworks/Xamarin.iOS.framework/Versions/16.4.0.23/lib/mlaunch/mlaunch.app/Contents/MonoBundle/AppThinning.framework/Versions/A/AppThinning' (no such file)
Hi everyone,
I recently upgraded my Xcode to version 16, and since then, I’ve encountered some issues with Visual Studio 2022 for Mac. The iOS simulators have disappeared, except for the generic simulator. When I connect my iPhone via USB, Visual Studio recognizes it, but when I attempt to run my project in debug mode, I receive a status message saying "waiting for debugger to connect to iPhone on port 10000 via USB." After a few minutes, this process fails.
Additionally, when I try to run the project in release mode, I encounter the same HE0004 dlopen error quoted above (Library not loaded: @rpath/AppThinning.framework/Versions/A/AppThinning).
I suspect that the new version of Xcode may not be fully compatible with Visual Studio 2022. Unfortunately, I cannot roll back to an older version of Xcode because I have updated my macOS to Sequoia (15.0.1), which appears to be incompatible with previous Xcode versions.
It is quite frustrating, as I can't debug my projects and my work is stalled.
Does anyone have any suggestions or solutions to this problem?
Thank you in advance for your assistance!
Is there any way to get dock size?
I got the Dock window using CGWindowListCopyWindowInfo, but I found that its frame covers my entire screen.
The same problem occurs with Notification Center.
Is there any way to get their sizes?
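In case it helps: the Dock process owns several windows (including full-screen backstop windows), so the full-screen frame may simply belong to a different window than the Dock tile itself. A sketch that lists every on-screen window owned by the Dock with its layer and bounds, so the smaller tile window can be picked out (recent macOS versions require Screen Recording permission to see window details):

```swift
import CoreGraphics
import Foundation

// Enumerate on-screen windows owned by the Dock and print layer + bounds.
let options: CGWindowListOption = [.optionOnScreenOnly]
let info = (CGWindowListCopyWindowInfo(options, kCGNullWindowID) as? [[String: Any]]) ?? []
let dockWindows = info.filter { ($0[kCGWindowOwnerName as String] as? String) == "Dock" }
for window in dockWindows {
    let layer = window[kCGWindowLayer as String] as? Int ?? 0
    if let dict = window[kCGWindowBounds as String] as? NSDictionary,
       let bounds = CGRect(dictionaryRepresentation: dict as CFDictionary) {
        print("layer \(layer): \(bounds)")
    }
}
print("\(dockWindows.count) Dock-owned windows")
```

Filtering on the window layer rather than taking the first match is what usually separates the visible Dock tile from the Dock's invisible full-screen windows.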
My post in the general forum was deleted because I am running the Sequoia beta. The issue seems to occur regardless of macOS version.
MacBook Air (M1, 2020):
The trackpad reacts to the slightest touch, either moving the cursor or clicking. Obviously I have adjusted the settings and it's still too sensitive.
If I don't take my fingers completely away from the trackpad as I use it, I often find new windows opening, dialog boxes popping up, ads getting clicked on, etc.
I have gone into trackpad preferences several times and adjusted everything I can find that ought to help, and it doesn't. Disabling “touch to click” remedies the unintentional clicks, although I would like to keep that functionality working as intended. The other nuisances remain.
I'm building a SwiftUI+RealityKit app for visionOS, macOS and iOS. The main UI is a diorama-like 3D scene which is shown in orthographic projection on macOS and as a regular volume on visionOS, with some SwiftUI buttons, labels and controls above and below the RealityView.
Now I want to add UI that is positioned relative to some 3D elements in the RealityView, such as a billboarded name label over characters with a "show details" button and such.
However, it seems the whole RealityView Attachments API is visionOS only? The types don't even exist on macOS. Why is it visionOS only? And how would I overlay SwiftUI elements over a RealityView using SwiftUI code on macOS if not with attachments?
I explored several methods to trigger a 35mm camera connected via USB:
1- ICCameraDevice: Unable to make it work with Canon cameras (details).
2- Canon's EDSDK: Works but is complex to implement.
3- gPhoto2 (command-line): Simple to use but requires gPhoto2 to be installed.
In your opinion, what is the most efficient way to trigger and download images via USB from Canon cameras?
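For completeness, option 3 in practice is a one-liner per shot. This assumes the gPhoto2 CLI is installed (e.g. via Homebrew) and the camera is in PTP mode; the filename pattern is just an example:

```shell
# Guard so the sketch degrades gracefully where gPhoto2 is absent.
if ! command -v gphoto2 >/dev/null 2>&1; then
  echo "gphoto2 not installed (brew install gphoto2)"
  exit 0
fi
# List connected cameras, then trigger the shutter and download the
# image over USB in one step.
gphoto2 --auto-detect
gphoto2 --capture-image-and-download --filename "capture-%n.jpg"
```

The trade-off is exactly as described above: simplest code path, but it adds an external runtime dependency that EDSDK and ImageCaptureCore avoid.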
Are “Notification Service Extensions” officially supported on macOS?
I’m developing an app for both iOS and macOS (not Catalyst). I’ve successfully set up a separate notification service extension for both the iOS and macOS targets. The iOS extension modifies the CKSubscription push notification as expected. However, the macOS notification service extension is never launched, no matter what I try (matching deployment targets, etc.).
I’m also asking because although Apple docs report that support for UNNotificationServiceExtension was added in macOS 10.14, the article at https://developer.apple.com/documentation/usernotifications/modifying_content_in_newly_delivered_notifications makes no mention of macOS, only iOS.
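I can't say whether macOS support is official, but for comparison, this is the minimal shape of the extension that works on the iOS side; if the same code never launches on macOS, the difference would be down to target configuration (principal class, NSExtensionPointIdentifier com.apple.usernotifications.service, and a payload carrying "mutable-content": 1) rather than the code itself. A sketch — the class name and the "[modified] " prefix are illustrative:

```swift
import UserNotifications

// Minimal service-extension skeleton shared by the iOS and macOS targets.
class NotificationService: UNNotificationServiceExtension {
    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttempt: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttempt = request.content.mutableCopy() as? UNMutableNotificationContent
        if let bestAttempt {
            // Visible marker so a successful launch is easy to spot.
            bestAttempt.title = "[modified] " + bestAttempt.title
            contentHandler(bestAttempt)
        }
    }

    override func serviceExtensionTimeWillExpire() {
        // Deliver whatever we have before the system terminates the extension.
        if let contentHandler, let bestAttempt { contentHandler(bestAttempt) }
    }
}
```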
I'm trying to setup a listener for kAudioProcessPropertyIsRunningOutput but it's never triggered. I get calls for kAudioProcessPropertyIsRunning and kAudioProcessPropertyDevices but not for kAudioProcessPropertyIsRunningInput or kAudioProcessPropertyIsRunningOutput.
class MyDelegate: PropertyListenerDelegate {
    func propertiesChanged(properties: [AudioObjectPropertyAddress]) {
        print(properties)
    }
}

var myDelegate = MyDelegate()
var processes = try AudioHardwareSystem.shared.processes
for process in processes {
    process.delegates += [myDelegate]
    try process.addListener(forProperties: [AudioObjectPropertyAddress(mSelector: kAudioPropertyWildcardPropertyID,
                                                                       mScope: kAudioObjectPropertyScopeWildcard,
                                                                       mElement: kAudioObjectPropertyElementWildcard)])
}
Xcode 16.1
macOS 15.0.1
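One thing worth trying while debugging this: register for the specific selectors instead of the wildcard, to separate "the wildcard does not expand to these selectors" from "these properties never fire". This reuses the same calls as the snippet above, so treat it as a sketch (AudioHardwareSystem and addListener(forProperties:) are taken from that code, and the import may differ in your project):

```swift
import CoreAudio

// Register explicitly for the three run-state selectors on every process.
let selectors: [AudioObjectPropertySelector] = [
    kAudioProcessPropertyIsRunning,
    kAudioProcessPropertyIsRunningInput,
    kAudioProcessPropertyIsRunningOutput,
]
do {
    let processes = try AudioHardwareSystem.shared.processes
    for process in processes {
        try process.addListener(forProperties: selectors.map {
            AudioObjectPropertyAddress(mSelector: $0,
                                       mScope: kAudioObjectPropertyScopeWildcard,
                                       mElement: kAudioObjectPropertyElementWildcard)
        })
    }
    print("registered \(processes.count) processes")
} catch {
    print("error: \(error)")
}
```

If the explicit registration fires for kAudioProcessPropertyIsRunningOutput while the wildcard does not, that points at the wildcard expansion rather than the property itself.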
Is it possible to play certain sounds from a macOS app that remain at full volume when VoiceOver is speaking?
Here's some background:
I want to play sounds from my app (even when it's not in focus) to notify a VoiceOver user that an action is available. (This action can be triggered even when the app is not in focus, and is configurable by the user.)
I tried using NSSound for this, but VoiceOver ducks the audio of my sound while it is speaking.
Is there some way to avoid audio ducking for certain sounds? Or is there another, perhaps lower-level, audio API that I can use to achieve this?
We are trying to build a macOS desktop app using Electron (it is basically an Angular application). Code signing and notarization have completed, but we are still unable to open the desktop app: we get the error pop-up below (screenshot attached), and the crash report follows.
CrashReporter Key: XXXX-XXXX-XXXX-XXXX-XXXX
Hardware Model: MacBook Pro (Obfuscated)
Process: xnode [5798]
Path: /Applications/[App Path]/Contents/MacOS/xnode
Identifier: ai.xnode.xnode
Version: 1.0.0 (1.0.0.43313)
Code Type: X86-64 (Native)
Role: Default
Parent Process: launchd [1]
Coalition: ai.xnode.xnode [5056]
Date/Time: [Redacted for Privacy]
OS Version: macOS 14.6.1 (23G93)
Release Type: User
Report Version: 104
Exception Type: EXC_CRASH (SIGKILL (Code Signature Invalid))
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: CODESIGNING 1 Taskgated Invalid Signature
Triggered by Thread: 0
Thread 0 Crashed:
0 dyld_path_missing 0x10dbb4010 _dyld_start + 0
1 main_executable_path_missing 0x10b395000 ???
Thread 0 crashed with X86 Thread State (64-bit):
rax: 0x0000000000000000 rbx: 0x0000000000000000 rcx: 0x0000000000000000
rdx: 0x0000000000000000 rdi: 0x0000000000000000 rsi: 0x0000000000000000
rbp: 0x0000000000000000 rsp: 0x00007ff7b4b6abf8 rip: 0x000000010dbb4010
Topic: Code Signing
SubTopic: Certificates, Identifiers & Profiles
Tags: macOS, Mac App Store, Code Signing
I am working on releasing my macOS arm64 app. My problem is that after the user downloads the dmg, double-clicking my.app in the dmg, a Gatekeeper pop-up box will appear with a warning that the developer cannot be verified.
Question: Can an application signed with "com.apple.security.cs.disable-library-validation" be published as trusted?
If yes, what steps have I missed?
If not, can I get an official response from Apple?
(Because I referred to this post, which seems to mention that it is possible to publish trusted software. I have looked up similar questions on the forum and tried many things, but nothing works.)
Here are my steps:
Use codesign to sign my.app. Because my app needs to access third-party dynamic libraries, entitlements.plist contains "com.apple.security.cs.disable-library-validation". A "codesign -dvvv" check shows the signature succeeded.✅
Use the "xcrun notarytool" command to notarize my app, and the status is displayed as accepted.✅
Use "xcrun stapler staple" to attach the notarization to my app, and it returns success.✅
Use the "spctl -a -v " command to verify whether my app has passed Gatekeeper, and it returns that it has passed.✅
Then I packaged my.app into a dmg, and then attached the notarization mark to the dmg, which was successful.✅
I completed the above steps and distributed the dmg. When I downloaded the dmg as a test user and double-clicked my.app inside it, the Gatekeeper pop-up still appeared saying the developer cannot be verified.❌
Hi Team,
Looking for an answer, if it's just us or a widespread issue.
Since September, our clients' Apple devices can't load a Captive Portal. The client wants the CNA to pop up, and I can't get it to happen!
Android and Windows devices all work correctly with their respective pop-ups, but the CNA will not appear.
No changes were made on our side, and after multiple troubleshooting sessions and getting vendors to take multiple PCAPs, we found that Apple devices are not initiating an HTTP GET request, as described by Meraki >> https://documentation.meraki.com/MR/MR_Splash_Page/Splash_Page_Traffic_Flow_and_Troubleshooting
The workaround is to force an HTTP GET request by manually opening the browser and loading an HTTP site (we tried 1.1.1.1 and other public HTTP sites, and it works), which then redirects to our Captive Portal page.
Hey everyone,
I've been updating my code to take advantage of the new Vision API for text recognition in macOS 15. I'm noticing some very odd behavior, though: in general, the new Vision API consistently produces worse results than the old API. For reference, here is how I'm setting up my request.
var request = RecognizeTextRequest()
request.recognitionLevel = getOCRMode() // generally accurate
request.usesLanguageCorrection = !disableLanguageCorrection // generally true
request.recognitionLanguages = language.split(separator: ",").map { Locale.Language(identifier: String($0)) } // generally 'en'
let observations = try? await request.perform(on: image) as [RecognizedTextObservation]
Then I process the results and just take the top candidate, which, as mentioned above, is typically of worse quality than the same request formed with the old API.
Am I doing something wrong here?
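For anyone wanting to A/B the two stacks, here is roughly the same request expressed with the pre-macOS 15 API (VNRecognizeTextRequest), with the same three parameters mapped across. A sketch for comparison, not a drop-in for the code above:

```swift
import Vision

// Old-style request with the equivalent configuration.
let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Top candidate only, mirroring the processing described above.
        if let top = observation.topCandidates(1).first {
            print(top.string, top.confidence)
        }
    }
}
request.recognitionLevel = .accurate       // getOCRMode() in the code above
request.usesLanguageCorrection = true      // !disableLanguageCorrection
request.recognitionLanguages = ["en"]      // language list as plain strings
// try VNImageRequestHandler(cgImage: image).perform([request])
```

Running both against the same images and diffing the top candidates is the quickest way to document the regression for a feedback report.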
1. Open the DingTalk macOS application, start a meeting, and initiate screen sharing.
2. The DingTalk app calls the [SCShareableContent getShareableContentWithCompletionHandler:] API.
3. The API takes more than 40 seconds to return a response.
I wrote a keychain controller that adds, deletes, and fetches keychain items using SecItemAdd(_:_:) and related APIs with the data protection keychain enabled (kSecUseDataProtectionKeychain). I am using it in a macOS Cocoa app.
I am using Swift Testing to write my tests to ensure that the controller works as expected.
As I understand, I should create my own keychain for testing rather than use the actual keychain in macOS. Currently, I created a separate keychain group (e.g. com.testcompany.testapp.shared) and added it to myapp.entitlements file so that the tests pass without failing because of the missing entitlement file.
The SecKeychainCreate(_:_:_:_:_:_:) and SecKeychainDelete(_:) APIs are deprecated, with no alternative provided in the documentation. I noticed the SecKeychain class, but the documentation doesn't explain much about it.
How should I test my keychain controller properly so that it does not use the actual macOS keychain, which is the "production" keychain?
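One common way to keep tests off the real keychain entirely is to put the controller behind a protocol and run the tests against an in-memory fake, reserving the Security-framework implementation for integration tests on a signed build. A sketch; KeychainStore and InMemoryKeychain are hypothetical names, not API:

```swift
import Foundation

// Abstraction over the three operations the controller performs.
protocol KeychainStore {
    func add(account: String, data: Data) throws
    func fetch(account: String) throws -> Data?
    func delete(account: String) throws
}

// Test double: same behavioral contract, no Security framework involved.
final class InMemoryKeychain: KeychainStore {
    private var items: [String: Data] = [:]
    func add(account: String, data: Data) throws { items[account] = data }
    func fetch(account: String) throws -> Data? { items[account] }
    func delete(account: String) throws { items.removeValue(forKey: account) }
}

// The production conformance would wrap SecItemAdd / SecItemCopyMatching /
// SecItemDelete with kSecUseDataProtectionKeychain, as described above.
let keychain: KeychainStore = InMemoryKeychain()
try keychain.add(account: "user", data: Data("secret".utf8))
print(try keychain.fetch(account: "user").map { String(decoding: $0, as: UTF8.self) } ?? "nil")
```

This sidesteps the deprecated SecKeychainCreate path entirely: the Swift Testing suite exercises the controller's logic against the fake, and only a small smoke test touches the data protection keychain with the real access group.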
Hey folks,
I want VoiceOver to speak punctuation in certain cases. On iOS, there seems to be the UIAccessibilitySpeechAttributePunctuation attributed string key to achieve that, but I can't find an alternative for macOS.
What is the recommended approach for achieving the same result?
Hi,
I'm testing AVMIDIPlayer in order to replace classes written on top of AVAudioEngine with callback functions sending MIDI events.
To test, I use an NSMutableData filled with:
the MIDI header
a track for the time signature
a track containing a few MIDI events.
I then create an instance of AVMIDIPlayer using the data.
Everything works fine for some instruments (00 … 20, or 90) but not for others (60, 70, …).
The MIDI header and the time signature track are based on the MIDI.org sample,
https://midi.org/standard-midi-files-specification
RP-001_v1-0_Standard_MIDI_Files_Specification_96-1-4.pdf
The MIDI events are:
UInt8 trkEvents[] = {
    0x00, 0xC0, instrument,   // Tubular bell
    0x00, 0x90, 0x4C, 0xA0,   // Note 4C
    0x81, 0x40, 0x48, 0xB0,   // TS + Note 48
    0x00, 0xFF, 0x2F, 0x00};  // End
for (UInt8 i = 0; i < 3; i++) {
    printf("0x%X ", trkEvents[i]);
}
printf("\n");
[_midiTempData appendBytes:trkEvents length:sizeof(trkEvents)];
A template application is used to change the instrument in an NSTextField.
I was wondering if something specific is required for some instruments?
The interface header:
#import <AVFoundation/AVFoundation.h>
NS_ASSUME_NONNULL_BEGIN
@interface TestMIDIPlayer : NSObject
@property (retain) NSMutableData *midiTempData;
@property (retain) NSURL *midiTempURL;
@property (retain) AVMIDIPlayer *midiPlayer;
- (void)createTest:(UInt8)instrument;
@end
NS_ASSUME_NONNULL_END
The implementation:
#pragma mark -

typedef struct _MThd {
    char magic[4];        // = "MThd"
    UInt8 headerSize[4];  // 4 bytes, MSB first. Always = 00 00 00 06
    UInt8 format[2];      // 16 bit, MSB first. 0; 1; 2. Use 1
    UInt8 trackCount[2];  // 16 bit, MSB first.
    UInt8 division[2];
} MThd;

MThd MThdMake(void);
void MThdPrint(MThd *mthd);

typedef struct _MIDITrackHeader {
    char magic[4];        // = "MTrk"
    UInt8 trackLength[4]; // Ignore, because it is occasionally wrong.
} Track;

Track TrackMake(void);
void TrackPrint(Track *track);

#pragma mark - C Functions

MThd MThdMake(void) {
    MThd mthd = {
        "MThd",
        {0, 0, 0, 6},
        {0, 1},
        {0, 0},
        {0, 0}
    };
    MThdPrint(&mthd);
    return mthd;
}

void MThdPrint(MThd *mthd) {
    char *ptr = (char *)mthd;
    for (int i = 0; i < sizeof(MThd); i++, ptr++) {
        printf("%X", *ptr);
    }
    printf("\n");
}

Track TrackMake(void) {
    Track track = {
        "MTrk",
        {0, 0, 0, 0}
    };
    TrackPrint(&track);
    return track;
}

void TrackPrint(Track *track) {
    char *ptr = (char *)track;
    for (int i = 0; i < sizeof(Track); i++, ptr++) {
        printf("%X", *ptr);
    }
    printf("\n");
}
@implementation TestMIDIPlayer

- (id)init {
    self = [super init];
    printf("%s %p\n", __FUNCTION__, self);
    if (self) {
        _midiTempData = nil;
        _midiTempURL = [[NSURL alloc] initFileURLWithPath:@"midiTempUrl.mid"];
        _midiPlayer = nil;
        [self createTest:0x0E];
        NSLog(@"_midiTempData:%@", _midiTempData);
    }
    return self;
}

- (void)dealloc {
    [_midiTempData release];
    [_midiTempURL release];
    [_midiPlayer release];
    [super dealloc];
}

- (void)createTest:(UInt8)instrument {
    /* MIDI header */
    [_midiTempData release];
    _midiTempData = nil;
    _midiTempData = [[NSMutableData alloc] initWithCapacity:1024];
    MThd mthd = MThdMake();
    MThd *ptrMthd = &mthd;
    ptrMthd->trackCount[1] = 2;
    ptrMthd->division[1] = 0x60;
    MThdPrint(ptrMthd);
    [_midiTempData appendBytes:ptrMthd length:sizeof(MThd)];

    /* Track header: time signature */
    Track track = TrackMake();
    Track *ptrTrack = &track;
    ptrTrack->trackLength[3] = 0x14;
    [_midiTempData appendBytes:ptrTrack length:sizeof(track)];
    UInt8 trkEventsTS[] = {
        0x00, 0xFF, 0x58, 0x04, 0x04, 0x04, 0x18, 0x08, // Time signature 4/4; 18; 08
        0x00, 0xFF, 0x51, 0x03, 0x07, 0xA1, 0x20,       // Tempo 0x7A120 = 500000
        0x83, 0x00, 0xFF, 0x2F, 0x00 };                 // End
    [_midiTempData appendBytes:trkEventsTS length:sizeof(trkEventsTS)];

    /* Track header: track events */
    ptrTrack->trackLength[3] = 0x0F;
    [_midiTempData appendBytes:ptrTrack length:sizeof(track)];
    UInt8 trkEvents[] = {
        0x00, 0xC0, instrument,   // Tubular bell
        0x00, 0x90, 0x4C, 0xA0,   // Note 4C
        0x81, 0x40, 0x48, 0xB0,   // TS + Note 48
        0x00, 0xFF, 0x2F, 0x00};  // End
    for (UInt8 i = 0; i < 3; i++) {
        printf("0x%X ", trkEvents[i]);
    }
    printf("\n");
    [_midiTempData appendBytes:trkEvents length:sizeof(trkEvents)];
    [_midiTempData writeToURL:_midiTempURL atomically:YES];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!_midiPlayer.isPlaying)
            [self midiPlay];
    });
}

- (void)midiPlay {
    NSError *error = nil;
    _midiPlayer = [[AVMIDIPlayer alloc] initWithData:_midiTempData soundBankURL:nil error:&error];
    if (_midiPlayer) {
        [_midiPlayer prepareToPlay];
        [_midiPlayer play:^{
            printf("MIDI player ended\n");
            [_midiPlayer stop];
            [_midiPlayer release];
            _midiPlayer = nil;
        }];
    }
}

@end
Call from the AppDelegate:
- (IBAction)actionInstrument:(NSTextField *)sender {
    [_testMidiplayer createTest:(UInt8)sender.intValue];
}
I want to get the content of a note or a mail thread when my cursor is in it on macOS (i.e., from the focused element).
But I can't get any element from either Mail or Notes using:
let result = AXUIElementCopyAttributeValue(appRef, kAXFocusedUIElementAttribute as CFString, &focusedElement)
Even when I try to check the available attributes:
let result = AXUIElementCopyAttributeNames(element, &attributeNames)
I get AXUIElementCopyAttributeNames result: AXError(rawValue: -25204)
But I do have the accessibility permission: when I run AXIsProcessTrusted() to check, it doesn't report an error.
Is it possible to do it that way, or do I have to change
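One thing that may be worth trying: query the system-wide accessibility element rather than the application element, then read attributes off the focused element it returns. Only a sketch (it still requires the Accessibility permission, and the API is unavailable to sandboxed apps):

```swift
import ApplicationServices

// Ask the system-wide element for the currently focused UI element,
// then try to read its value attribute.
let systemWide = AXUIElementCreateSystemWide()
var focused: CFTypeRef?
let err = AXUIElementCopyAttributeValue(systemWide,
                                        kAXFocusedUIElementAttribute as CFString,
                                        &focused)
if err == .success, let focused = focused {
    let element = focused as! AXUIElement
    var value: CFTypeRef?
    let valueErr = AXUIElementCopyAttributeValue(element,
                                                 kAXValueAttribute as CFString,
                                                 &value)
    print("value error: \(valueErr.rawValue), value: \(String(describing: value))")
} else {
    print("focused element error: \(err.rawValue)")
}
```

Note also that AXIsProcessTrusted() returning without error is not the check; it returns a Bool, and only a true result means the permission is actually granted.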
I have tests where I connect to NEPacketTunnelProvider. I run the tests with CircleCI and fastlane, on self-hosted Intel and ARM Macs. I updated the Macs from macOS 13 to macOS 14, and the tests on ARM stopped connecting, while the same tests on Intel kept working as usual. To be precise, the tests fail only when run from CircleCI and fastlane: if I cancel the job and click "connect" myself in the app left hanging from the cancelled tests, the connection succeeds, but while the tests are running, the connection fails. Running the tests from Xcode succeeds too.
These are the logs from the tunnel. Could you suggest where to dig? Or maybe you can see the issue from the logs?
Tunnel logs when they fail