Is Applebot using a cached robots.txt file?

We are trying to get deep links working with our app in test flight.

When we try to validate our website with the tool at https://search.developer.apple.com/appsearch-validation-tool/, it keeps saying: "File is blocked by robots. Please check your url and try again."

Earlier today our robots.txt contained "Disallow: /", so the error made sense. We have since updated the file to allow crawling.
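For reference, the updated robots.txt looks roughly like this (a sketch; "Applebot" is the user-agent token Apple documents for its crawler, and the rule set here is just an example):

```
# Allow Apple's crawler so the validation tool can fetch the site
User-agent: Applebot
Allow: /

# Rules for other crawlers, if any, would go here
```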

My question is: do we know how long Applebot caches robots.txt? And is there a way to request that Applebot fetch the new version?

Any help would be appreciated.

Have you found a solution to this? Or was it resolved on its own after a certain amount of time?
