The charging port of my iPhone may have been damaged by water, so it can no longer charge or transfer data over a cable. It can only be charged wirelessly, which does not support data transfer. Since Xcode supports wireless debugging, I was still able to test my app. However, I recently switched to a new Mac, and the new Mac has no previous connection record with the iPhone, which makes wireless debugging impossible.
So I would like to know: how can I set up wireless debugging with a device like this, which has no prior pairing record?
I am currently preparing my submission for the Swift Student Challenge, and my app playground is quite comprehensive. Based on my estimations, it may take approximately 4 to 5.5 minutes for the reviewers to fully experience the interactive elements of my app. Every component is integral to the overall experience, and I would prefer not to remove any content, as each part not only contributes to the overall interactivity but also effectively demonstrates my abilities across different technical and creative domains.
However, I noticed the guideline on https://developer.apple.com/swift-student-challenge/eligibility stating that the interactive scene should be “experienced within three minutes.” While this does not appear to be a main requirement, my app playground significantly exceeds this timeframe.
Could you kindly clarify whether exceeding the three-minute guideline could result in my submission being rejected, or if it might negatively impact the evaluation process? I would greatly appreciate any insights you can provide.
Thank you for your time and consideration. I look forward to your response.
I am currently filling out the Swift Student Challenge form, and I have two questions that I hope to get clarified:
One of the options asks, “Did you use open source software, other than Swift?” I would like to know what is meant by “open source software” in this context. Does it refer to IDEs (like Xcode) or programming languages and frameworks (such as Python, ARKit)? Are Apple frameworks (e.g., SwiftUI, ARKit, etc.) and certain third-party tools (such as Xcode, Blender, etc.) considered “open source software”?
I would like to provide a demo video to ensure that the reviewers can use the app properly and experience all of its features in the shortest amount of time. For certain reasons, I do not plan to play the video directly in the App Playground. Instead, I intend to include a link in the “Comments” section at the end of the form, which will redirect to a webpage (requiring an internet connection) containing the demo video. Will the reviewers be able to view the link and access the video as intended?
I would greatly appreciate any responses to these questions!
I intend to participate in the Swift Student Challenge. My application contains a link that directs users to an HTML web page on the Internet.
Link(destination: URL(string: "https://url.com")!) {
    Label("Developer Website - .....com", systemImage: "arrow.right")
        .shadow(color: Color.white, radius: 50)
}
This URL points to my personal web page. Although it is not directly related to the interactive experience within the application, I decided to include it because it serves as a logo of sorts and demonstrates my proficiency in HTML. However, the challenge's rules stipulate that the evaluation environment is not permitted to connect to the Internet. Consequently, I am concerned that my work may be rejected because it appears incomplete or broken. Should I keep it? Thanks!
I am preparing a submission for the Swift Student Challenge. I have created a RealityContent folder using Reality Composer Pro. How can I import this folder into an app playground (.swiftpm) project so that it becomes a usable package?
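For reference, this is roughly how I expect to load the content once the package can be imported. This is only a sketch of my intent: I am assuming the generated package is named RealityContent, that it exports a realityContentBundle constant, and that it contains an entity named "Scene" (these names come from my project, not from any documented requirement).

import SwiftUI
import RealityKit
import RealityContent // the Reality Composer Pro package (assumed name)

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // "Scene" is a placeholder for an entity defined in the package.
            if let scene = try? await Entity(named: "Scene", in: realityContentBundle) {
                content.add(scene)
            }
        }
    }
}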
I am interested in participating in the Swift Student Challenge. My application contains a significant amount of augmented reality (AR) content, which requires access to the camera. Evidently, if the reviewer uses a simulator or runs the app on a Mac, they will not be able to experience the AR functionality; experiencing it requires running the app on a real iPad with camera access.
However, it is mentioned in https://developer.apple.com/forums/thread/773530 that the plan is to evaluate Xcode app playgrounds within the simulator. Additionally, I observed the statement “Note: Xcode app playgrounds are executed in Simulator” on the submission page. Consequently, it is clear that the reviewers are limited to using a simulator or running my application on a Mac.
In light of this, I am seeking guidance on how to enable the reviewer to use a real iPad to experience the AR camera functionality. Alternatively, I may need to reconsider my approach and stop using AR.
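If it helps to clarify the alternative, here is a minimal sketch of the runtime fallback I am considering, checking AR availability before deciding which experience to show:

import ARKit

// World tracking is unavailable in Simulator and on a Mac, so this
// check can route reviewers to a non-AR version of the scene.
if ARWorldTrackingConfiguration.isSupported {
    // Run the full AR camera experience (real iPad).
} else {
    // Present a non-AR 3D version of the same content instead.
}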
It was mentioned in the Swift Student Challenge that outstanding winners will have the opportunity to visit Apple Park in the United States. However, as a challenger from China who is not currently in the U.S., this means that if I receive the outstanding award, I will need to apply for a visa to travel to Apple Park. Since I am under 18, my guardian would also need to apply for a visa. Therefore, I would like to know if Apple provides visa assistance for outstanding winners and their guardians from China, or if we are responsible for applying for the visas on our own.
I am interested in learning the Metal framework for rendering development. However, most of Apple's official documentation uses Objective-C code. Therefore, I am seeking guidance on whether focusing solely on Swift is a viable path to gaining proficiency in Metal.
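As a concrete example of what I mean, here is a minimal sketch of Metal device setup written in Swift (no rendering, just the objects the Objective-C documentation describes):

import Metal

// The Objective-C APIs in the Metal documentation map directly to Swift.
guard let device = MTLCreateSystemDefaultDevice(),
      let commandQueue = device.makeCommandQueue() else {
    fatalError("Metal is not supported on this device")
}
let commandBuffer = commandQueue.makeCommandBuffer()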
I intend to participate in the Swift Student Challenge 25. The rules state that an app playground submission should be experienceable within three minutes. However, my work does not meet this requirement.
Create an interactive scene in an app playground that can be experienced within three minutes.
Initially, my work was not intended for the Challenge but for the App Store. However, I decided to submit it to the Challenge, and both my work and I meet the Challenge's requirements. Because my work is a complete application, it is impossible for the judges to experience it within three minutes; it may take more time. Does this have any impact?
In visionOS, there is an existing modifier that completely hides the hands. However, I am interested in learning how to make only one hand disappear while the other remains visible.
.upperLimbVisibility(.hidden)
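For context, this is how I currently apply it, a minimal sketch at the scene level (which hides both hands; ImmersiveView stands in for my content):

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { _ in }
    }
}

@main
struct HandsApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "immersive") {
            ImmersiveView()
        }
        // Hides both upper limbs; I am looking for a per-hand variant.
        .upperLimbVisibility(.hidden)
    }
}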
In macOS, I am encountering an issue where the system API fails to open a file even though the necessary Read/Write permissions are enabled in the App Sandbox settings. Could you please explain the reason behind this behavior? Thanks!
import AppKit

// Opens the file at the given path with its default application.
func finderOpenFileSystem(at path: String) {
    let fileURL = URL(fileURLWithPath: path)

    // Confirm the file exists before asking NSWorkspace to open it.
    guard FileManager.default.fileExists(atPath: path) else {
        print("Error: File does not exist at path: \(path)")
        return
    }

    let success = NSWorkspace.shared.open(fileURL)
    if success {
        print("File opened successfully: \(fileURL)")
    } else {
        print("Error: Failed to open file: \(fileURL)")
    }
}
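In case it is relevant, my understanding (which may be wrong) is that with only the user-selected file entitlement, a sandboxed app can reach arbitrary paths only after the user picks them. Here is a variant I am comparing against, going through NSOpenPanel:

import AppKit

// Variant for comparison: with the "User Selected File" Read/Write
// entitlement, the sandbox grants access to URLs the user picks
// through an open panel.
func openUserSelectedFile() {
    let panel = NSOpenPanel()
    panel.canChooseFiles = true
    panel.canChooseDirectories = false
    if panel.runModal() == .OK, let url = panel.url {
        NSWorkspace.shared.open(url)
    }
}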
In the fileImporter on iOS, when I select a folder, the returned URLs always point into the private folder, which prevents me from accessing these files within the app. However, I do not wish to copy them into the sandbox. What steps can I take to resolve this issue?
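For reference, here is a minimal sketch of the security-scoped access pattern I have read about, in case this is the intended route (the view and state names are mine):

import SwiftUI
import UniformTypeIdentifiers

struct FolderPickerView: View {
    @State private var showImporter = false

    var body: some View {
        Button("Choose Folder") { showImporter = true }
            .fileImporter(isPresented: $showImporter,
                          allowedContentTypes: [.folder]) { result in
                guard case .success(let url) = result else { return }
                // URLs returned by fileImporter are security-scoped;
                // access must be started before reading and stopped after.
                let granted = url.startAccessingSecurityScopedResource()
                defer { if granted { url.stopAccessingSecurityScopedResource() } }
                let files = try? FileManager.default.contentsOfDirectory(
                    at: url, includingPropertiesForKeys: nil)
                print(files ?? [])
            }
    }
}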
It appears that there is a size limit when training a Tabular Classification model in Create ML. When the training data is small, the training process completes smoothly after a certain period. However, as the data volume increases, the following issue occurs: initially, the status indicates that training is in progress, but after approximately 24 hours it is automatically terminated. I am certain that this is not a manual termination by myself or others, but an automatic termination by the machine. This issue persists despite numerous attempts, and the only message displayed is "Training Canceled." I would appreciate it if someone could explain the reason behind this behavior and provide a solution. Thank you for your assistance.
I am currently training a Tabular Classification model in Create ML. The dataset comprises 30 features, with 1,000,000 training data points and 1,000,000 validation data points. Could you please estimate the approximate training time on an M4 Max MacBook Pro?
During the training process, Create ML has been displaying a "Processing" status, but there is no progress bar. I would like to know whether the training is still ongoing, as I often suspect that it has stalled.
In RealityKit's particle effects, there is a type:
ParticleEmitterComponent.Presets
It provides certain ready-made particle effects, but I am interested in learning how to modify these particle emitters (such as adjusting the color, the number of particles, or the emission range).
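For example, this is the kind of adjustment I am hoping to make. A sketch of my best reading of the API, starting from a preset and overriding individual properties (the specific property names are my assumption from the documentation):

import RealityKit
import UIKit

// Start from a preset, then override individual emitter properties.
let entity = Entity()
var emitter = ParticleEmitterComponent.Presets.sparks
emitter.mainEmitter.birthRate = 500                   // particles per second
emitter.mainEmitter.color = .constant(.single(.red))  // particle color
emitter.emitterShapeSize = [0.5, 0.5, 0.5]            // emission volume
entity.components.set(emitter)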