I've spent the last few days researching the upcoming laws in Texas and other US states, and how these laws will affect developers around the world. I want to share what I've learned so far with the community and get feedback on my current understanding. This post isn't focused on a single API so much as the bigger picture.
Background
The law essentially mandates that: (1) app store platforms implement age categorization and verification mechanisms, and (2) developers implement logic to listen to age categorization signals provided by the platform and respond accordingly. You can read the law itself here: https://capitol.texas.gov/tlodocs/89R/billtext/html/SB02420S.HTM
Most people seem to be interpreting the law as follows: All developers who distribute apps in the USA are effectively required to implement the new APIs (required by Texas, not by Apple). The penalties are heavy, but it's unclear whether developers would actually be pursued and by whom (e.g. would someone seriously pursue an alarm clock app because it could be accessed by a minor?).
Putting aside the ethical, privacy, and legal issues (and the damaging precedents this law sets), most people seem to agree that, from a technical perspective, this is a very silly way to implement age blocking (the app store collects the info and passes it to the dev, and the dev is responsible for blocking access). It would make far more sense for the platform to block the app directly for affected users (with optional API support for developers who wish to use it). However, the law appears to specifically mandate this division of responsibility, so Apple's hands were tied.
Apple has basically complied with their obligations by providing the relevant APIs to developers.
Because the law is vague and open-ended, there are a lot of legal and technical uncertainties about what developers actually need to do to be compliant. Understandably, Apple seems reluctant to provide any guidance that could be interpreted as legal advice. Apple's docs simply describe what the APIs do, with no guidance on what the overall flow is meant to look like or how and when the APIs should actually be used in practice.
Americans familiar with the political situation seem to think there's a possibility of an injunction before the law goes into effect, but that looks increasingly unlikely given that the effective date is only two weeks away.
Developer solutions
Many devs seem to be exploring two main workarounds, at least as temporary solutions: (1) Raise your app's rating to 18+. Putting aside the fact that Texas law would effectively be forcing developers to raise their global age rating (resulting in lost revenue that extends far beyond Texas), it remains unclear whether this solution is actually legally compliant, since the law specifically mandates that apps must implement logic to respond to signals from the platform. (2) Geo-block Texas. Again, it remains unclear if this is compliant, because geo-blocking is not 100% accurate and it doesn't actually do what the law says you have to do. It also creates issues if you already have users in Texas, and it means performing additional privacy-hostile checks (i.e., detecting the user's location, even for users who are not subject to the law).
The DeclaredAgeRange API is actually pretty straightforward to use – although there is still a lack of documentation on certain edge cases and it's difficult to test. In addition, the new APIs are only available in iOS 26.2, so it's unclear what you need to do if you're still supporting < iOS 26.2. Some people are of the opinion that developers can only reasonably respond to the signals that are available, which pushes responsibility for earlier OS versions back onto the platform (see the sketch below).
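If you accept that interpretation, the whole thing can hang off an availability check, so pre-26.2 users never reach the new code paths. A minimal sketch (enforceAgeGateIfRequired() is a placeholder name of mine, fleshed out below):

```swift
// Called once on every launch (e.g. from your App/SceneDelegate setup).
func runAgeComplianceChecks() async {
    if #available(iOS 26.2, *) {
        // The platform signal exists on this OS version, so respond to it.
        await enforceAgeGateIfRequired()
    }
    // On earlier OS versions there is no signal to respond to, so
    // (under this interpretation) the app takes no action.
}
```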
The API provides a Bool (AgeRangeService.shared.isEligibleForAgeFeatures) that tells you whether the user is someone to whom age checks need to be applied (see https://developer.apple.com/documentation/declaredagerange/agerangeservice/iseligibleforagefeatures). I'm not 100% sure, but perhaps the simplest action you can take is to check this Bool on launch and block access if it's true. In any case, this API looks very useful because it means we can avoid applying the checks in other jurisdictions and to grandfathered-in users without needing to implement custom geo-tracking code (albeit only on iOS 26.2+).
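If that's right, the launch gate could be as small as the following. Treat it as a sketch only: I'm assuming isEligibleForAgeFeatures is a synchronous property, and presentBlockedScreen() / runDeclaredAgeRangeFlow() are placeholder names of mine (the latter is sketched after the next paragraph).

```swift
import DeclaredAgeRange

@available(iOS 26.2, *)
func enforceAgeGateIfRequired() async {
    // Users in unaffected jurisdictions (and grandfathered-in users)
    // come back as not eligible, so no geo-tracking code is needed.
    guard AgeRangeService.shared.isEligibleForAgeFeatures else { return }

    // This user is subject to age checks: run the full flow (next
    // paragraph) and block access if it doesn't succeed.
    let allowed = await runDeclaredAgeRangeFlow()
    if !allowed {
        presentBlockedScreen() // placeholder for your blocking UI
    }
}
```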
To implement the API, my current thinking is that, on every launch, I should first check the above Bool and, if it's true, do the following: (1) get the App Store age rating with let appStoreAgeRating = await AppStore.ageRatingCode ?? 18, (2) request the user's age with let ageRangeResponse = try await AgeRangeService.shared.requestAgeRange(ageGates: appStoreAgeRating), (3) check that the user has agreed to share their age, (4) check that lowerBound >= appStoreAgeRating, and (5) check that the verification method is not one of the self-declared methods. If this procedure fails, I should block access to the app and provide a link to Apple's support page (https://support.apple.com/en-us/122770). I stress, however, that this is just my current idea and there are some edge cases I'm unsure about.
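Put together, steps (1)–(5) come out roughly like this. I want to stress that this is only a sketch of my current thinking, not a known-good implementation: the AppStore.ageRatingCode and requestAgeRange(ageGates:) calls are used exactly as quoted above, the .sharing / .declinedSharing handling reflects my reading of the response type, and the step (5) verification-method check is commented out because I haven't confirmed what that property is actually called in iOS 26.2.

```swift
import DeclaredAgeRange
import StoreKit

@available(iOS 26.2, *)
func runDeclaredAgeRangeFlow() async -> Bool {
    do {
        // (1) Use the App Store age rating as the age gate, falling
        //     back to 18 if it's unavailable.
        let appStoreAgeRating = await AppStore.ageRatingCode ?? 18

        // (2) Request the user's declared age range from the system.
        let response = try await AgeRangeService.shared
            .requestAgeRange(ageGates: appStoreAgeRating)

        switch response {
        case .sharing(let range):
            // (4) The declared lower bound must meet the age gate.
            guard let lowerBound = range.lowerBound,
                  lowerBound >= appStoreAgeRating else {
                return false
            }
            // (5) ASSUMPTION: reject self-declared ranges. I haven't
            // confirmed the real property/case names, so this is left
            // as a comment:
            // guard range.declaration != .selfDeclared else { return false }
            return true

        case .declinedSharing:
            // (3) The user did not agree to share their age.
            return false

        @unknown default:
            return false
        }
    } catch {
        // Conservative reading: treat any error as unverified, block,
        // and link to https://support.apple.com/en-us/122770
        return false
    }
}
```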
Other issues
It is possible to do some basic testing of the API, but only using a sandbox App Store account on a physical device. From the Developer section in iOS Settings, you can select from a few different scenarios, like "Texas user aged 14 without parental consent", etc.
There's also a whole separate aspect to this law relating to "significant updates". Everyone seems kinda confused about this, but the general idea appears to be that, if your app's age classification changes in the future, the app should respond to that change. My current interpretation is that using AppStore.ageRatingCode as the age gate (as described above) should cover this, but I haven't really looked into this aspect of the law yet.
There's also another aspect to this law requiring developers to revoke access to the app when requested by the parent. I haven't looked into this yet, but as noted above, it's unclear to me why this is the developer's responsibility given that the platforms already provide solid parental controls. Do I need to do something else in addition to what I've sketched out above?
It goes without saying, of course, that everything above is not legal advice, and I still have some gaps in my understanding.
I would really appreciate any feedback on the above, perhaps with recommendations about better ways to approach this.