System programming in macOS... where to start?

Hi All,

We have a couple of desktop applications written in Win32 C++ that we are planning to port to native macOS desktop applications. We have clients requesting this, so there is a real need. The thing is, these applications reach beyond their own windows and interact with the OS interface itself. Imagine a Windows application running in the background with an icon in the system tray; from time to time it wakes up, clicks some buttons in a separate window, and also clicks into a text box and "types" into it. This was easy to do on Windows because the native Win32 API exposes a lot: we can control the mouse and send keystrokes to the OS itself.

Is this possible in macOS, and if so, what do we need to check or study to get this working? What keywords should we search for to point us in the right direction? Thank you in advance.

PS. We are aware of the security risk of this, which is why we are asking whether it's possible in macOS (as it can be done in Windows). Our software would of course need to be granted the right access to run.

It is possible to send mouse and keyboard events using functions such as CGEventPost and CGEventPostToPid. Doing so requires that the user grant permission in System Settings > Privacy & Security > Accessibility. You can request that the user be asked to provide this permission using CGRequestPostEventAccess.
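A minimal Swift sketch of what that looks like. The screen coordinates and the virtual key code (0, which is the "a" key on ANSI keyboards) are placeholders, not anything specific to your app; the call will silently do nothing until the Accessibility permission is granted:

```swift
import CoreGraphics

// Sketch: post a synthetic left click at a point, then type one key.
// Requires the Accessibility permission (System Settings > Privacy &
// Security > Accessibility); CGRequestPostEventAccess() prompts the
// user for it if it has not been granted yet.
func clickAndType() {
    guard CGRequestPostEventAccess() else {
        print("Accessibility permission not granted")
        return
    }

    let point = CGPoint(x: 200, y: 300)  // placeholder screen coordinates

    // A click is a mouse-down event followed by a mouse-up event.
    let mouseDown = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                            mouseCursorPosition: point, mouseButton: .left)
    let mouseUp = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                          mouseCursorPosition: point, mouseButton: .left)
    mouseDown?.post(tap: .cghidEventTap)
    mouseUp?.post(tap: .cghidEventTap)

    // Likewise, a keystroke is key-down followed by key-up.
    // Virtual key code 0 is 'a' on an ANSI keyboard layout.
    let keyDown = CGEvent(keyboardEventSource: nil, virtualKey: 0, keyDown: true)
    let keyUp = CGEvent(keyboardEventSource: nil, virtualKey: 0, keyDown: false)
    keyDown?.post(tap: .cghidEventTap)
    keyUp?.post(tap: .cghidEventTap)
}
```

Note that posting to `.cghidEventTap` delivers the events system-wide, much like SendInput on Windows; CGEventPostToPid instead targets one process by PID.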

I know less about how to actually locate buttons and text fields in an app that is not your own. I think this would involve accessibility APIs. The Accessibility Programming Guide unfortunately seems to be all about making an app accessible, not using accessibility APIs. Here's a possible place to start: https://stackoverflow.com/questions/69002718/how-get-ui-elements-of-a-window-swift
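Those accessibility APIs are the AXUIElement family in ApplicationServices. A rough sketch of walking another app's UI tree to find elements by role (the PID is a placeholder, and this too requires the Accessibility permission, which AXIsProcessTrusted() reports):

```swift
import ApplicationServices

// Return the accessibility children of an element, or [] on failure.
func children(of element: AXUIElement) -> [AXUIElement] {
    var value: CFTypeRef?
    let err = AXUIElementCopyAttributeValue(
        element, kAXChildrenAttribute as CFString, &value)
    guard err == .success, let array = value as? [AXUIElement] else {
        return []
    }
    return array
}

// Recursively collect all elements with the given role (e.g. "AXButton").
func findElements(withRole role: String,
                  under element: AXUIElement) -> [AXUIElement] {
    var results: [AXUIElement] = []
    var value: CFTypeRef?
    if AXUIElementCopyAttributeValue(
           element, kAXRoleAttribute as CFString, &value) == .success,
       let r = value as? String, r == role {
        results.append(element)
    }
    for child in children(of: element) {
        results += findElements(withRole: role, under: child)
    }
    return results
}

// Usage sketch (12345 is a hypothetical PID of the target app):
//   let app = AXUIElementCreateApplication(12345)
//   let buttons = findElements(withRole: kAXButtonRole as String, under: app)
//   // "Click" a button:
//   // AXUIElementPerformAction(buttons[0], kAXPressAction as CFString)
//   // Set the text of a text field:
//   // AXUIElementSetAttributeValue(field, kAXValueAttribute as CFString,
//   //                              "hello" as CFTypeRef)
```

Good search keywords for this area are "AXUIElement", "Accessibility API macOS", "CGEventPost", and "Quartz Event Services".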
