Is the issue of code-theft via decompilation or reverse engineering common for Swift iOS apps? And can I protect a small portion of my code?

I'm a new app developer and I've read through most relevant posts on this topic here and elsewhere. Many of the forum posts here are specific to Objective-C, or old enough to be considered outdated in the fast-moving world of computing. Many of the posts elsewhere are about protecting authentication secrets, which doesn't apply in my case, and a lot are by someone with a product to sell, which I've ignored.

My app is 99.9% Swift and I'm not going to store any authentication secrets in the IPA. What I'd like to protect is the core mechanism of my product, which has to ship in the binary and is small (< 10k lines). I want to make stealing the source code harder than recreating my functionality from scratch, which would be difficult even with the app in front of them.

From what I've gathered, Swift code compiled by Xcode is protected from reverse engineering / decompilation by the following:

  1. Symbolization of the app
  2. Native builds from Xcode destroy the names of variables, functions, etc.
  3. Swift code is compiled in such a way that makes stealing harder than Objective-C

This should make me feel better, but the threat level is rising with the availability of free, commercial-grade decompilers (e.g. Ghidra) and machine learning. The fact that iOS 18 still runs on checkm8-vulnerable (i.e. jailbreakable) devices means that decrypting the IPA from memory is still trivial.

Questions

  1. People talk about stealing authentication secrets via reverse-engineering, but is the same true for mechanisms (i.e. code)?
  2. How common is the issue of source-code stealing in iOS apps?
  3. Can machine learning be leveraged to make decompilation/reverse engineering easier?
  4. Will I get rejected by App Review for obfuscating a small portion of my code?

I don't have answers to all your questions, and I'm not an expert, so take this with a grain of salt.

Symbolization of the app

I think you mean symbol stripping (sometimes called "desymbolication"), which removes some symbols (like variable names) from your app.

Native builds from Xcode destroy the names of variables, functions, etc.

Yes, though some class / method names will still be visible.

Swift code is compiled in such a way that makes stealing harder than Objective-C

It's a bit harder to reverse engineer right now: decompiling Swift usually produces slightly more convoluted code than decompiling Obj-C. It's not impossible, though.

The fact that iOS 18 still runs on checkm8-vulnerable (i.e. jailbreakable) devices means that decrypting the IPA from memory is still trivial.

You can also download some iOS apps on macOS from the Mac App Store now (developers can opt out of allowing this), which makes it easier to obtain the IPA without jailbreaking.

Also, the IPA is not encrypted on your iPhone, but such methods used to be needed to get the IPA, as you couldn't download iOS apps on devices other than iPhones.

People talk about stealing authentication secrets via reverse-engineering, but is the same true for mechanisms (i.e. code)?

Authentication secrets are meant to stay secret, and they probably pose a greater danger than stolen mechanisms. As for code stealing: as long as the code runs on a device, it's safe to assume you can't completely hide it. I'm not sure how common code stealing is.

Can machine learning be leveraged to make decompilation/reverse engineering easier?

You can ask generative ML models what a portion of decompiled code might do, and they can produce good results. I think they could be leveraged, and we might see new reverse-engineering tools built on them in the future, but we'll have to wait.

Will I get rejected by App Review for obfuscating a small portion of my code?

No, since they don’t see your source code.

I don’t think reverse engineering is something you need to worry about. It will almost always be easier for someone to re-implement code than to reverse-engineer it. There are some exceptions, such as when you have server communication and the attacker wants to pretend to be your app when talking to your server.

Of more concern is extracting non-code, for example the expensively-designed graphical or sound resources in a good game.

If you really think that someone might want to steal your (source) code, I think attacks like phishing, guessing your GitHub password, or supply-chain attacks could be of more concern.

Thank you both for responding! Since most of my questions on the forum go unanswered, I feel spoiled just by having two responders.

Yes, though some class / method names will still be visible.

Is there a way to tell which ones will be?

Also, the IPA is not encrypted on your iPhone, but such methods used to be needed to get the IPA, as you couldn't download iOS apps on devices other than iPhones.

Sorry, I meant decrypted binary. Basically, I read up on how people obtain and tweak IPAs, and the process seems to be to:

  1. Obtain an IPA via a jailbroken iDevice. While the IPA is accessible (so don't store secrets in there), the compiled app binary inside it is encrypted
  2. Start the app on the iDevice, which loads the decrypted app into memory
  3. Using various tools, capture the decrypted app from memory
  4. Tweak and repackage
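
(Incidentally, one countermeasure I've seen cited against step 4: App Store binaries ship FairPlay-encrypted, so an app can check at runtime whether its main binary still reports an encrypted segment. A minimal sketch, assuming a 64-bit binary and that I'm reading the Mach-O load commands right; the function name is mine, and a jailbroken attacker can patch this out:)

    import MachO

    // Sketch: walk the main binary's Mach-O load commands looking for
    // LC_ENCRYPTION_INFO_64. cryptid == 1 on an App Store install; 0 means
    // the binary was likely decrypted and repackaged. Assumes arm64.
    func mainBinaryLooksDecrypted() -> Bool {
        let LC_ENCRYPTION_INFO_64: UInt32 = 0x2C  // value from <mach-o/loader.h>
        // Image 0 is the main executable.
        guard let header = _dyld_get_image_header(0) else { return false }
        var cursor = UnsafeRawPointer(header) + MemoryLayout<mach_header_64>.size
        for _ in 0..<header.pointee.ncmds {
            let cmd = cursor.load(as: load_command.self)
            if cmd.cmd == LC_ENCRYPTION_INFO_64 {
                let info = cursor.load(as: encryption_info_command_64.self)
                return info.cryptid == 0
            }
            cursor = cursor.advanced(by: Int(cmd.cmdsize))
        }
        return false
    }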

Of more concern is extracting non-code, for example the expensively-designed graphical or sound resources in a good game.

I encrypt all assets at rest, but decrypt them to load into memory when needed. I think I'm protected by this process, but if there's a way to pull the assets from memory, nothing I do would help. Is this assumption true?

P.S. There seems to be an issue with how quotes are handled by this forum; multiple quotes can lead to misaligned formatting

Is there a way to tell which ones will be?

I'm not sure, but I think most class names and public methods will be, because Swift and Objective-C include runtime type information. Private methods can sometimes appear too.

You can disassemble your file and you'll see them.
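
A quick way to see this from inside your own app (a sketch; the class names are made up):

    import Foundation

    // Plain Swift types leave only mangled names in the binary's metadata,
    // while NSObject subclasses and @objc members also register readable
    // names with the Objective-C runtime, which survive symbol stripping.
    class PureSwiftSecret {}                 // mangled Swift metadata only
    @objc class BridgedSecret: NSObject {    // visible via the ObjC runtime
        @objc func coreMechanism() {}        // selector name stays readable
    }

    print(NSStringFromClass(BridgedSecret.self))  // e.g. "MyApp.BridgedSecret"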

Sorry, I meant decrypted binary. Basically, I read up on how people obtain and tweak IPAs, and the process seems to be to:

Sorry, my bad, I got confused. I think IPAs are not encrypted when you download iOS apps on a Mac (at least it seems so, but I'd need to check). But yes, they are encrypted on iOS until runtime. You are right about that.

I encrypt all assets at rest, but decrypt them to load into memory when needed. I think I'm protected by this process, but if there's a way to pull the assets from memory, nothing I do would help. Is this assumption true?

This is obfuscation: depending on how the mechanism works, one can reverse it and obtain your asset decryption key. So I wouldn't say the assets are "protected", rather "better hidden".

One more thing: as you said, an idea is to make it easier to remake the mechanism from scratch rather than copy your app's code. But I think you should also focus on branding your product so that even if clones / alternatives appear, yours will still be the best one, the one that people choose. Again, I'm not experienced in publishing apps yet, but this is what makes sense to me. It might be easier said than done.

This is obfuscation: depending on how the mechanism works, one can reverse it and obtain your asset decryption key. So I wouldn't say the assets are "protected", rather "better hidden".

Even if I store it in the enclave? My current system creates a unique key for each user-device combination, and uses that to encrypt assets at rest. The enclave key is said to never leave the enclave, so I'm convinced that this is the best I can do within the confines of the framework and platform.
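
Roughly, the scheme looks like this (a simplified sketch of what I'm doing with CryptoKit; the type and function names are made up, and in the real app the enclave key's dataRepresentation is persisted in the keychain rather than regenerated each time):

    import CryptoKit
    import Foundation

    enum AssetVault {
        // A P-256 key generated inside the Secure Enclave (requires a real
        // device); the private scalar never leaves the enclave.
        static func makeAssetKey() throws -> SymmetricKey {
            let enclaveKey = try SecureEnclave.P256.KeyAgreement.PrivateKey()
            // ECDH against the key's own public half is deterministic, so
            // the same symmetric key can be re-derived from the stored key.
            let secret = try enclaveKey.sharedSecretFromKeyAgreement(with: enclaveKey.publicKey)
            return secret.hkdfDerivedSymmetricKey(
                using: SHA256.self,
                salt: Data("asset-encryption-v1".utf8),  // illustrative label
                sharedInfo: Data(),
                outputByteCount: 32
            )
        }

        // Assets stay sealed at rest and are opened into memory on demand.
        static func seal(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
            try AES.GCM.seal(plaintext, using: key).combined!
        }
        static func open(_ ciphertext: Data, with key: SymmetricKey) throws -> Data {
            try AES.GCM.open(AES.GCM.SealedBox(combined: ciphertext), using: key)
        }
    }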

If someone can pull assets from memory then no amount of clever encryption will help unless the memory itself is encrypted, but this is one of the few security features that Apple hasn't implemented.

But I think you should also focus on branding your product so that even if clones / alternatives appear, yours will still be the best one, the one that people choose.

This is a good suggestion and I hope to succeed at this, but having two protections is better than one. Your suggestion is based on the idea that apps can have a good first-mover advantage, and I agree, but I've heard horror stories of apps being cloned within moments of release, and the cloner has an advantage since they can just leap-frog off my work.

My current system creates a unique key for each user-device combination, and uses that to encrypt assets at rest.

OK, so these must be assets that the app downloads, not things included in the bundle that you distribute, right? I was referring to assets like game graphics, sounds, etc. that are in the app bundle.

With your scheme, make sure you test what happens when a user gets a new device and your app is copied over automatically for them. Do the keys get copied over too? Do they still work? This has caused me pain. In particular, users may not use your app for some time after getting the new device and not associate the device replacement with the app not working anymore.
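
One way to at least detect the situation (a sketch, assuming the wrapped key lives in the keychain; the account name is made up): items stored with a *ThisDeviceOnly accessibility class are not restored onto a replacement device, so their absence right after a migration tells you the app data was copied but the device-bound key was not.

    import Foundation
    import Security

    // Illustrative account name, not an API.
    let account = "com.example.assetKey"

    func storeWrappedKey(_ keyData: Data) -> OSStatus {
        let attrs: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: account,
            kSecValueData as String: keyData,
            // Never leaves this device via backup or device-to-device transfer.
            kSecAttrAccessible as String: kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly,
        ]
        return SecItemAdd(attrs as CFDictionary, nil)
    }

    func wrappedKeyStillPresent() -> Bool {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: account,
            kSecReturnAttributes as String: true,
        ]
        var result: CFTypeRef?
        // errSecItemNotFound right after a migration means the key is gone.
        return SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess
    }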

Even if I store it in the enclave? My current system creates a unique key for each user-device combination, and uses that to encrypt assets at rest. The enclave key is said to never leave the enclave, so I'm convinced that this is the best I can do within the confines of the framework and platform.

Oh, well, that could work, but, as you said:

If someone can pull assets from memory then no amount of clever encryption will help unless the memory itself is encrypted

That is also true.

but this is one of the few security features that Apple hasn't implemented.

We are talking here about jailbroken devices, i.e. devices whose security features have been bypassed. There are already many security features that need to be bypassed in order to gain such control, and adding such a feature might slow down your device or be useless depending on how it's implemented (one already has broad access on a jailbroken phone, and that might include access to this hypothetical process memory decryption key). Maybe something could be done using the enclave, but remember that we're talking about process memory, and there are many processes on iOS.

the idea that apps can have a good first-mover advantage

Yes, but also, even if an app isn't the first one, there can be reasons why it's better. The first one isn't necessarily the best one.

With your scheme, make sure you test what happens when a user gets a new device and your app is copied over automatically for them.

This is a really good point I hadn't thought of. Looks like I'll either have to delete everything (with an explanation) or have some way of detecting a transfer and then re-encrypting. Without transfer detection, this re-encryption mechanism could be abused.

@CMDdev

one already has broad access on a jailbroken phone, and that might include access to this hypothetical process memory decryption key

I'm fairly sure the enclave key never leaves the enclave and therefore can't be intercepted unless the enclave itself was somehow compromised, but I think a compromised enclave and its implications would mean Apple's whole security model is broken.

The attackers not being able to get a key doesn't matter if everything the key is meant to protect can be captured via memory. Moreover, enclave keys are only one kind of key; many others need to stay in memory. Are they all vulnerable? I don't think Apple security engineers are fools; I think we might be missing something if decrypted memory capture really is as simple as it sounds.

Yes, but also, even if an app isn't the first one, there can be reasons why it's better. The first one isn't necessarily the best one.

This also applies to most Apple hardware, which tends not to be the first of its kind but is (usually) the most well-executed.

P.S. Turns out the word "nuk.e" (sans period) is censored. I don't get it.

I'm fairly sure the enclave key never leaves the enclave and therefore can't be intercepted unless the enclave itself was somehow compromised, but I think a compromised enclave and its implications would mean Apple's whole security model is broken.

Yeah that’s true, the key would be protected (I was talking about protecting process memory, not your idea of encrypting assets).

If you had kernel-level control over a jailbroken device, maybe you could leverage this and make the OS decrypt the process memory (if such APIs existed in this hypothetical OS).

The attackers not being able to get a key doesn't matter if everything the key is meant to protect can be captured via memory.

I was talking about encrypting the process memory itself (the feature you said Apple did not implement), but yeah, it totally depends on how you would implement such a feature.

Moreover, enclave keys are only one kind of key; many others need to stay in memory. Are they all vulnerable?

I was talking about a hypothetical scenario. It depends on how you use the keys in the enclave.

I don't think Apple security engineers are fools; I think we might be missing something if decrypted memory capture really is as simple as it sounds.

Yeah, well, to gain the OS rights required to read other processes’ memory, you need to bypass many security measures already implemented in the OS. It’s not as simple as it sounds.

@CMDdev

Yeah that’s true, the key would be protected (I was talking about protecting process memory, not your idea of encrypting assets).

Most of that paragraph was me not reading your post closely enough and just ranting on. My apologies. I agree with your general point that even with memory encryption, the jailbroken state of the device means the memory might not be safe, though I'd say that if memory encryption were implemented, the keys would likely be stored in something akin to the Secure Enclave.

Anyway, I'll add as many pieces of Swiss cheese to my defenses as my time, ability, and the app's performance allow. Anything to slow down the competition.
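
For instance, one slice I'm considering is the commonly cited PT_DENY_ATTACH trick (a sketch; ptrace has no public iOS header, so it's looked up dynamically, and a jailbroken attacker can bypass it easily):

    import Darwin

    // Ask the kernel to refuse debugger attachment. Just a speed bump.
    func denyDebuggerAttach() {
        let PT_DENY_ATTACH: CInt = 31  // value from <sys/ptrace.h>
        typealias PtraceFn = @convention(c) (CInt, pid_t, UnsafeMutableRawPointer?, CInt) -> CInt
        guard let handle = dlopen(nil, RTLD_NOW),
              let symbol = dlsym(handle, "ptrace") else { return }
        let ptrace = unsafeBitCast(symbol, to: PtraceFn.self)
        _ = ptrace(PT_DENY_ATTACH, 0, nil, 0)
    }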

